A look at when smart city adoption of technology might start to disengage the community.
The City of West Sacramento is trying to listen to its community more effectively by using an app to monitor publicly available social media posts, helping it identify and address issues in a more timely manner. The motivation appears entirely honorable, as Mayor Christopher Cabaldon states:
“It’s not that [the app] replaces our other forms of civic engagement, it’s just a way to listen more,”
But the endeavor has attracted some unfavorable media coverage nonetheless.
It is not hard to identify the need here. Cities see issues flare up quickly on social media, so a system that forewarns them of issues gaining currency would doubtless be useful.
But increasingly there is a negative reaction to this ‘surveillance’ of social media. These concerns are usually couched in terms of privacy, but our interest here is the impact on engagement: the validity of data collected this way, and whether it leaves the community feeling more or less engaged.
What if the use of artificial intelligence (AI) means that the community doesn’t know it is being engaged at all? This already occurs in some cases where cities scan social media for issues, and I predict it will also happen as cities and other government organizations trawl the results of past engagement to extrapolate community needs and wants.
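To make concrete what such scanning might look like at its very simplest, here is a hypothetical sketch in Python. The keyword list, threshold, and sample posts are all illustrative assumptions on my part; a real system such as the one West Sacramento uses would rely on far more sophisticated natural language processing than keyword matching.

```python
from collections import Counter

# Hypothetical watchlist of civic-issue keywords (illustrative only).
ISSUE_KEYWORDS = {"pothole", "streetlight", "flooding", "noise"}

def flag_trending_issues(posts, threshold=2):
    """Count keyword mentions across posts and flag any issue
    mentioned at least `threshold` times."""
    counts = Counter()
    for post in posts:
        words = set(post.lower().split())
        for keyword in ISSUE_KEYWORDS & words:
            counts[keyword] += 1
    return {kw: n for kw, n in counts.items() if n >= threshold}

posts = [
    "Another pothole on 5th Street, third one this month",
    "The pothole near the park wrecked my tire",
    "Streetlight out again on Elm Ave",
]
print(flag_trending_issues(posts))  # → {'pothole': 2}
```

Even this toy version illustrates the point at issue: the residents whose posts were counted have no idea their complaints just became a data point in a council briefing.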
Stuart Reeve of Micromex Research has found that a resident’s satisfaction with their local council is intrinsically linked to the extent to which they feel engaged.
“across our analysis of over a dozen LGAs [Local Government Areas], we could see that the key drivers of overall satisfaction with Council was the content and scope of Council interaction with its residents. In fact, in most studies the community engagement variables contributed 20%-30% towards overall satisfaction. Community engagement is critical.”
Covert engagement, be it through scanning social media or extrapolating past results, is unlikely to give a community member the sense of being listened to or of being part of a conversation. Cities getting their information this way risk leaving the community feeling excluded from the process, simply because residents are unaware they are being heard. This could see community cynicism about government increase even as decisions are backed by more, not less, community information.
The process is important for the community. It’s rare to reach consensus on any issue, and the engagement process is often what signals to those who did not get their way that they were listened to, and that the process was fair and reasonable.
Efficient as covert engagement might appear, it is also worth asking questions about the quality of this approach. Might a person’s social media posts, aimed at friends and family, be altogether more flippant and less well informed than the statements they would choose to make in response to a government organization’s request for feedback? Is it fair to take these statements out of context and treat them as representative community opinions? At the very least, it seems the individual should be contacted to make them aware of the impact of their statements and to give them the chance to revise them.
At a more fundamental level, we need to ask whether our community has the right to provide input on specific issues rather than having its views assumed by an algorithm.
It seems clear that AI will start to play a greater role in our civic engagement, and this can be a good thing. A city like West Sacramento, using AI to help it scan for issues arising in the community, is unlikely to stop engaging the community on key decisions. Let’s face it: if the City tried to rely on social media alone, it would hardly find a representative sample of the community chattering about the issues it needs to decide on:
“Hey mom did you hear the city is planning to upgrade ordinance 1256?”
“No, son, but I would be broadly supportive of that change so long as the height control isn’t varied”
Mayor Cabaldon’s description of ‘helping us to listen more’ is an apt one: it’s a measured extension of the city’s ability to know what its community is concerned about.
But if something like Moore’s law (the observation that transistor density, and with it computing power, roughly doubles every two years) applies to AI, these are issues we will have to think about very seriously in the coming years.