The Creating Safer Space Participatory Evaluation and Learning Methodology for UCP
Traditional evaluation methods are often top-down or output-focused and therefore risk missing important nuances of civilian protection efforts. Aware of these limitations, the research network Creating Safer Space brought together a group of UCP practitioners and scholars in the “Innovation in Evaluation and Learning” working group. The group discussed good-practice approaches to evaluation and developed principles that should inform evaluation design. The following sets out the participatory approach to UCP evaluation that the group recommends.
Areas of evaluation
Based on the group’s discussions, evaluations of UCP activities should explore the following areas:
- Do people feel safer? If so, in what ways? How do they know they feel safer, and what is that feeling based on or linked to?
- Do they think UCP/A (or another intervention) has made a difference to their safety? If so, what has made the difference and how (specific practices, people, activities)?
- Has UCP introduced new or different roles and narratives in relation to safety in the community?
- Has there been a change in what people in the community perceive safety to be?
Guiding principles
The group identified four principles that should guide participatory evaluations of UCP/A:
- Participatory: The evaluation design needs meaningful input from communities, partners, and/or stakeholders so that it reflects their language and context. Ideally, questions are developed together with local communities, partners, and/or stakeholders.
- Organisation- or community-internal: Those conducting the evaluation are internal to the organisation or community whose protection efforts they are evaluating, rather than being external evaluators.
- Semi-structured conversations: Evaluations with individuals or groups should be conducted through iterative, open-ended questions, with room for follow-up questions. Informal conversations in natural settings are useful where possible, as are observations from the field and, where they can support participatory methods, desk reviews of existing documentation.
- Ongoing reflection on process: Questions and interactions should be reviewed continually in light of feedback from participants. Evaluators should work with participants to find appropriate phrasing and the best forms of interaction, and should think about “evaluating the evaluation” throughout the lifetime of the process.
Evaluation process
In terms of evaluation process, the Innovation in Evaluation and Learning group identified the following as important steps:
1. Generating data: Begin by asking people open questions, then follow up with further questions to explore topics in more depth.
   - For example, do not only ask whether respondents have noticed change, but also how they notice this change and what they perceive that change to be linked to (ask what, why, how, and when questions).
   - Collate demographic and context data (where it is possible and safe to do so), so that findings can be disaggregated at the analysis stage (see step 2 below).
2. Analysing data: Analyse the data for patterns and/or emphasis through thematic analysis:
   - Look for themes that come up time and again.
   - Look for silences in the data: Are there themes you anticipated but that are missing from the data? What could be the reasons?
   - Look at whether the data aligns with particular project goals, research agendas, or funding agendas.
   - Disaggregate the data to see whether there are differences between groups (gender, age, abilities, identities, etc.) or between geographies (urban and rural, different parts of town, etc.). To be able to do so, this type of data generation needs to be embedded into the evaluation process (see step 1 above); a minimal illustrative sketch follows after this list.
3. Verifying findings: After analysing the data, discuss the findings with partners and people in the community to check that the themes identified reflect what people were trying to communicate.
4. Documenting the results: When writing a report on your evaluation results, include a paragraph explaining the methodology. This will help others in your organisation and/or funders appreciate how you arrived at the results and how valid they are.
5. Reflecting on the evaluation process: Reflect on the usefulness and validity of the exercise and identify how it could be improved in the next evaluation.
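
The disaggregation described in step 2 can be done by hand or in a spreadsheet; the sketch below simply illustrates the logic in Python with pandas, assuming responses have already been coded into themes. All respondent identifiers, demographic fields, and theme labels are hypothetical placeholders, not part of the working group’s methodology.

```python
# Hypothetical sketch of counting and disaggregating coded themes (step 2).
# All names, values, and themes below are illustrative placeholders.
import pandas as pd

# One row per coded theme mention, with the demographic/context data gathered in step 1
coded = pd.DataFrame([
    {"respondent": "R01", "gender": "woman", "location": "urban", "theme": "feels safer at night"},
    {"respondent": "R02", "gender": "man",   "location": "rural", "theme": "trust in accompaniers"},
    {"respondent": "R03", "gender": "woman", "location": "rural", "theme": "feels safer at night"},
    {"respondent": "R04", "gender": "man",   "location": "urban", "theme": "trust in accompaniers"},
    {"respondent": "R05", "gender": "woman", "location": "urban", "theme": "new community roles"},
])

# Themes that come up time and again (overall emphasis)
print(coded["theme"].value_counts())

# Disaggregation: does the emphasis differ by gender or by urban/rural location?
print(pd.crosstab(coded["theme"], coded["gender"]))
print(pd.crosstab(coded["theme"], coded["location"]))
```

Whatever tool is used, the key point from step 1 still applies: the demographic and context fields have to be captured at the data-generation stage, or the findings cannot be disaggregated later.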
If you have used this methodology, please get in touch and let us know how it worked for you!