When the site first went live I judged everything by numbers: unique visitors, page views, votes and, above all else, comments. I would log in regularly and worry when online stakeholder consultations did not attract large numbers of comments or an active discussion.
As time has gone on I have become less sure of this indicator of success. It is always good to see an active discussion going on — that is what we developed Bang the Table for — but in many cases masses of comments are not what our clients are looking for, and they are not really an indicator of success.
Take two recent Local Government online stakeholder consultations as examples. The first, a consultation on a Council Management Plan, had attracted just a handful of comments when I ran into the General Manager of the Council at a function. I was feeling rather sheepish about the low response rate. He had a totally different take on the situation.
In the previous year, the Council in question had held 10 public meetings to consult the community on their management plan. Total attendance across all the events was (I am told) fewer than 50. I have no idea what this would have cost, but I would guess about $20,000 once venue hire, facilitation and staff time are taken into account. The General Manager pointed out to me that this year, in one week, the site had been visited by over 100 different people who had left a smattering of comments. What did this stakeholder consultation cost? $1,000.
So although we had only a few comments, the online stakeholder consultation had reached more people than a more traditional consultation, at a fraction of the cost.
The second example is an online stakeholder consultation on an area planning strategy. It attracted very few comments and I agonised over this. Usually when this happens it is down to the site not being well promoted, but this page was attracting regular visitors. The visitors were clearly interested in the subject, as they were downloading an average of more than one document each from the library — a very high rate compared with other stakeholder consultations. I was left to conclude that there was nothing wrong with this consultation: people were simply comfortable with the proposed strategy and saw no need to argue. On checking with the Council, this aligned well with their conclusions from public meetings.
If people have a chance to comment but do not, then that alone can be an indicator of community satisfaction. While it is important not to interpret silence as support, I believe we can reasonably interpret silence as indifference in circumstances where there are:
- healthy visitor numbers
- high numbers of document downloads
- few people leaving comments or votes; and
- low response rates through traditional consultation methods (meetings, submissions and letters) as well.
We know that when people feel passionate they use the site to comment in their hundreds. If they do not, and the site is still being visited, it seems fair to argue that this reflects a lack of passion.
For many bureaucratic consultations, this reflects a job well done. As citizens, we rarely get excited in support of government institutions, but we are quick to react when they get things wrong. So, in some cases, silence can be interpreted as indifference — and that can be a successful outcome in itself.
Photo Credits: Halina Ward
Thanks for getting all the way to the bottom! Subscribe to our monthly digest newsletter if you’d like to be kept up to date about community engagement practice globally. Take a look at our two product websites: EngagementHQ if you need a complete online engagement solution, and BudgetAllocator if you need a participatory budgeting solution. Or get in touch if you have a story idea you think is worth sharing.