Newsletter, March 2007

Evaluating engagement processes

One way to reduce cynicism about engagement processes, as well as to ensure they are as good as they can be, is to evaluate properly what works and what does not.

Evaluation from different perspectives

Processes need to be evaluated from two perspectives: the organiser’s and the participants’. Both are essential to determining how successful a process was.

To gain the participants’ perspective it is useful to ask questions about:

  • their understanding of the exercise’s purpose
  • how easy or difficult they found it to respond
  • whether they felt the process enabled them to express their own views clearly
  • how confident they felt that their contributions would be appreciated and used.

These questions enable the organiser to find out how satisfied participants were with the methods used and whether they felt the process genuinely gave them an opportunity to contribute to the topic being consulted on.

From the organiser’s perspective the questions need to be a little different:

  • how effective the methods were in eliciting the participants’ views
  • the usefulness of responses received
  • the level and type of participation achieved
  • costs and value for money
  • learning points for the next time.

After all, there is no point in employing even the most sophisticated engagement method if the process gets in the way of participants giving their views or produces responses that cannot be used.

Following up evaluation

Doing an evaluation is only the first step. The next is to take account of the feedback and make sure the next engagement process uses everything you have learned from the previous one. The aim should be to create a culture of incremental improvement so that every process is better than the last.

The case study below shows how an evaluation can be used in practice.

Case Study: Sustainable Development Panel

The Sustainable Development Commission’s (SDC) Panel of 600 began work in September 2006. The first topic on which the SDC consulted the Panel was ‘Redefining Progress’, focusing on economic growth, people’s happiness and overall wellbeing.

There were three stages to the process. Stage 1 posed a series of questions to obtain the Panel’s views on the very broad concept of economics and wellbeing. Stage 2 enabled Panel members to read each other’s responses and answer a second set of questions aimed at clarifying and prioritising points made in Stage 1. Stage 3 then provided the Panel with the results of the earlier stages and an evaluation form for members to provide feedback on the consultation process. In all, 147 Panel members completed the form, either in part or in full.

The evaluation, which was more extensive than usual for an online process, was divided into a series of sections with questions relating to:

  • ease of use
  • overall consultation process issues
  • accessibility
  • political engagement.

The feedback, much summarised here, has helped us understand how participants found the technology provided for the consultation process.

Ease of use

The majority of people found the consultation either quite or very easy to use. Comments about the structure and navigation of the site included:

… the navigation of the site was very easy. I found navigating around it very intuitive. I think this is one instance of “if it’s not broke, don’t fix it!”

I found the site logically arranged and easy to navigate round. The background information was good and comprehensive.

I think most people who have agreed to be panel members appreciate that commitment is required. I found that there was the opportunity to be brief or expand. However, the structure leads you to provide more lengthy answers, which from your perspective can be a nightmare to collate.

Some participants did request that a PDF version of the questions be made available to download and read offline, so that they could return to the consultation website and submit their answers later. This is a perfect example of the sort of invaluable feedback that evaluation can provide and that can be acted upon immediately.

Overall consultation process issues

To ensure that future consultations are clearly understood, we asked a series of questions about the topic, the objectives, the layout of the questions and the supporting information.

Most Panel members found the objectives of the consultation to be fairly clear and many found the first consultation topic to be interesting.

The questions asked were very big and thought provoking, which at first was a little daunting… However, I enjoyed the challenge! As a result, I appreciated the length of time available to respond and the ability to return to the consultation to make amendments, which enabled me to take time to think about the questions and my own thoughts on such important issues.

I think the volume of response demonstrates that the process was relatively simple and easy to follow. It has not been particularly demanding or difficult and I have enjoyed being a part of the consultation.

Panel members, however, were divided as to whether they had too little or too much time to complete the consultation: some talked about each session being open for long enough but the actual answering of questions taking too long. This is the sort of feedback that is harder to use: we will probably maintain the session length but see if it is possible to streamline questions.

Another aspect of the consultation that produced mixed opinions was the discussion forum that ran in parallel with the structured process. Most participants did not use it, mainly due, they said, to lack of time. So the value of such a forum will need to be examined further; the forum might, for instance, be more useful on its own rather than in combination with a more formal consultation process.

Accessibility

When we originally launched the Panel we wanted to make it as accessible as possible to everyone within the UK. One of the questions we asked in the evaluation was whether we were right to translate the primary consultation material into Welsh.

Overall, Panel members appreciated being asked this question and their comments generally questioned the logic of only providing one alternative language. They offered a number of possible solutions, such as:

… How about producing it in English only but writing a note in Welsh to say that you’d provide a Welsh version for anyone that needs it on request. You could do the same for large print, audio or other languages.

… giving up [translation into other languages] is not the answer but more encouragement and promoting, perhaps through an informal forum apart from the main one, just to be able to start debating in Welsh [or other languages]?

This shows the value of asking the people who spot the problems to come up with potential solutions.

Political engagement

Dialogue by Design does not usually ask ‘political’ questions as part of evaluation, but in this case the Hansard Society, an independent, non-partisan charity that operates across the political spectrum, was keen to use the opportunity to provide insight into people’s experiences of online political engagement.

The frequency of internet use, the type of sites visited and whether or not someone kept their own blog or ran a discussion board were just some of the questions. Interestingly, most Panel members who took part in the evaluation believed that their involvement in political processes could make a difference. Almost all thought that participating in online consultations is a credible way of engaging with policy and many agreed that they would recommend it to others.

Conclusion

Feedback such as the last point above can also be very heartening. It reinforces our sense that the style of engagement process used for the Sustainable Development Panel is fundamentally worthwhile, even if there is still some room for improvement.

Evaluation is often portrayed as a negative process focused on identifying what went wrong. This is only half of it: the rest is about strengthening what is right and building on what works.

 

 