
Creating Evaluation Support for PE


Project Summary

The Manchester Beacon for Public Engagement programme had a complex evaluation framework, created to report to funders and to improve public engagement practice.

Project Partners

Suzanne Spicer
Public Engagement Manager

Benefits & Impact

  • 450 published guidance packs distributed.
  • Over 50 digital copies downloaded from the author’s own website.
  • Tested by 8 community groups, 3 universities and 1 museum.
  • Supported by 6 podcasts.

Impact on participants

  • Practical advice encouraged and inspired practitioners to carry out evaluation, often using more creative and accessible data collection tools.
  • Developed a greater understanding of how evaluation can assess and improve the value of work, as copies have been passed between peers and throughout whole teams or departments.
  • Helped to plan and embed evaluation from the outset.
  • Provided a format for reporting to funders.
  • Became part of induction processes in external organisations (e.g. the BBC).
  • Helped tutors advocate the case for and practicalities of evaluation to students.
  • Helped collate public and community feedback about the work of public engagement practitioners and improve two-way learning.


“It’s changed how I think about evaluation. I know it’s important to do it, how important it is to improve how we do things in the future. I think about evaluation from the outset now, we think about that from the start.” Practitioner


“The work done on evaluation has opened my eyes to the importance and professional ways we should be approaching evaluation” Senior manager

Lessons Learnt

  • Being clear about aims and objectives, and the reasons for developing the guide.
  • Using a development model of testing a version with academics, researchers, project managers and community groups.
  • Creating a flexible approach that can be adapted and work alongside other funders’ evaluation requirements.
  • Taking a people-centred approach and considering the different needs and preferences of people.
  • Making hard copies and digital versions available.
  • Providing examples of feedback forms and creative consultation tools.
  • Offering the pack out beyond the Manchester Beacon partners.
  • Offering complementary support such as evaluation training, mentoring and podcasts.

Top Tips

What could be done differently?

  • Provide the guide as close to the start of a programme as possible.
  • Have a clear brief from the outset (ideas changed along the way and this delayed the guide’s development).

Policy recommendations

  • Establish evaluation as part of a culture of reflection and improvement by providing practical advice.
  • Plan to embed evaluation from the outset by including a request for evaluation plans within funding applications or strategic commissioning processes.

Evaluation indicators

  • 100% of project proposals to include evidence of evaluation planning.
  • Number of staff reporting feeling more supported to undertake public engagement activity.
  • Number of participants that are likely to take further action as a result of public engagement activity.
  • Increase in positive perception that university public engagement activity is important and relevant.
  • Improved external perception of partnership working with universities.

Resources and Links

For evaluation support see the links on page:


It was identified that the evaluators needed consistent information for analysis, but it also became apparent that everyone involved had varying levels of existing evaluation knowledge, from well-established to none at all. Support was required in ways that were robust, accessible, reflective and realistic, and that complemented the evaluation requirements of other funders and programmes. The emphasis therefore moved from a series of fixed evaluation templates to a shared ethos, underpinned by a published evaluation guide.

Aims & Objectives

  • To track the progress of the Manchester Beacon programme to understand the benefits and challenges involved, and report findings to funders.
  • To influence improved public engagement practice based on analysis.
  • To support evaluation of public engagement practice in the Manchester Beacon partners and their community partners.
  • To support the sharing of learning across Beacon, community partners and the wider public engagement sector.

