Pilot Projects: Innovation in Evaluation and Learning in Unarmed Civilian Protection and Accompaniment (UCP/A)
The AHRC-GCRF Network Plus Creating Safer Space welcomes applications for pilot projects of participatory evaluation and learning related to Unarmed Civilian Protection and Accompaniment (UCP/A). We offer two types of funding:
- Type 1 Funding: We will fund up to three small projects that meet the criteria set out in this Call for Applications and are led by an organisation that is registered in a Low and Middle Income Country (LMIC) or which has a country/field office in an LMIC (see eligibility criteria for further details). Grants per project range from £3,000 to a maximum of £4,500 GBP, and the total amount available under this call is £9,500 GBP.
- Type 2 Funding: We will fund one small project that meets the criteria set out in this Call for Applications and whose Lead Organisation is a US 501(c)(3) non-profit organization that can receive funding from a Donor-Advised Fund (DAF) (see eligibility criteria for further details). The grant is up to a maximum of $4,000 USD.
The application deadline is 12 June 2023.
Below you will find the following information:
- Background to the Call
- The CSS Participatory Evaluation and Learning Methodology
- Eligible applicants
- Eligible costs
- Assessment process
- Evaluation criteria
- Timeline
You can also download a PDF of the Call here: Call for Applications – Evaluation pilot projects_final.
Please use this Application Form to apply for the grants: Application form – Evaluation pilot projects_final.
Background to the Call
Creating Safer Space is aware of the sensitivities connected with UCP/A, and of how traditional methods of evaluation are often top-down or output-focused and therefore risk missing important nuances of civilian protection efforts. Over the past few months, a group of practitioners and scholars have therefore discussed good practice approaches to evaluation and developed principles that should inform evaluation design.
This Innovation in Evaluation and Learning group has developed a participatory approach to UCP evaluation and now seeks to fund projects that will pilot the methodology set out below.
The CSS Participatory Evaluation and Learning Methodology
Based on the group’s discussions, evaluation of UCP/A activities under this call should address the following areas:
- Do people feel safer? If so, how? How do they know they feel that way? What is it based on or linked to?
- Do they think UCP/A (or another intervention) has made a difference to their safety? If so, what has made the difference and how (specific practices, people, activities)?
- Has UCP introduced new or different roles and narratives in relation to safety in the community?
- Has there been change in relation to what people perceive to be safety in the community?
Four principles for participatory evaluation projects were identified, which should guide the pilot projects under this call:
- Participatory: The evaluation project design needs to have meaningful input from communities, partners and/or stakeholders so as to reflect language and context. Optimally, questions are developed with local communities, partners, and/or stakeholders.
- Organisation or community-internal: Those conducting the evaluation are internal to the organisation or community whose protection efforts they are evaluating, rather than being external evaluators.
- Semi-structured conversations: Evaluations with individuals or groups should be conducted through iterative, open-ended questions, with room for follow-ups. Informal conversations in natural settings are useful where possible, as are observations from the field and – where they can support participatory methods – desk reviews of existing documentation.
- Ongoing reflection on process: Questions and interactions should be constantly reviewed in light of feedback from participants. Evaluators should work with participants to find correct phrasing and best forms of interaction, and should think about “evaluating the evaluation” during the lifetime of the process.
In terms of research design, the Innovation in Evaluation and Learning group identified the following as important steps:
- Generating data: Begin by asking people open questions, then follow up with further questions to explore topics in more depth.
- For example, do not only ask whether respondents have noticed change, but also how they notice this change and what they perceive that change to be linked to (ask what, why, how, and when questions).
- Collate demographic and context data (if possible/safe to do so), in order to be able to disaggregate findings at the analysis stage (see step 2 below).
- Analysing data: Look for patterns and/or emphasis through thematic analysis:
- Look for themes that come up time and again.
- Look for silences in the data: Are there any themes that are missing from the data although you anticipated them? What could be the reasons?
- Look to see if data aligns with certain project goals, research or funding agendas.
- Disaggregate data to see if there are differences between different groups (gender, age, abilities, identities, etc.) or different geographies (urban-rural, different parts of town, etc.). To be able to do so, this type of data generation needs to be embedded into the evaluation process (see step 1 above, and the illustrative sketch after this list).
- Verifying findings: After analysing the data, discuss the findings with partners and people in the community to ensure the themes identified are reflective of what people were trying to communicate.
- Reflecting on the evaluation process: Given that these are pilot projects, successful applicants should reflect on the usefulness and validity of the exercise, and identify how it could be improved.
- Documenting the pilot project: The expectation is that successful applicants write a brief report on their experience of using the methodology. This report does not need to include any specific participant- or case-related data (where sharing such data would not be safe or ethical), but it should reflect on high-level learning and include reflections on the Participatory Evaluation and Learning Methodology proposed here.
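For teams that code their conversation notes digitally, the theme-counting and disaggregation described in the analysis step above can stay deliberately simple. The sketch below is a minimal, hypothetical illustration in Python (all data, theme labels, and field names are invented for this example); it shows one possible way to tally recurring themes and break the tallies down by a demographic attribute, not a required tool:

```python
from collections import Counter

# Hypothetical coded records: each entry pairs a respondent's demographic
# attributes with the themes coded from their semi-structured conversation.
responses = [
    {"gender": "female", "age_group": "18-30", "area": "urban",
     "themes": ["feels safer", "trust in accompaniers"]},
    {"gender": "male", "age_group": "31-50", "area": "rural",
     "themes": ["feels safer", "new community roles"]},
    {"gender": "female", "age_group": "31-50", "area": "rural",
     "themes": ["no perceived change"]},
]

# Count how often each theme recurs across all conversations; themes you
# anticipated but that never appear are the "silences" worth probing.
overall = Counter(theme for r in responses for theme in r["themes"])
print("Overall theme counts:", dict(overall))

def disaggregate(records, attribute):
    """Tally theme counts separately for each value of one demographic attribute."""
    counts = {}
    for r in records:
        group = counts.setdefault(r[attribute], Counter())
        group.update(r["themes"])
    return counts

# Compare theme patterns between groups, e.g. urban vs. rural respondents.
for group, themes in disaggregate(responses, "area").items():
    print(f"{group}: {dict(themes)}")
```

Whether such tooling helps will depend on the scale of the evaluation: for a small number of conversations, manual tallying may serve just as well and keep the analysis closer to participants.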
Eligible applicants
Due to the nature of funding available for this Call for Applications, only the following organisations are eligible:
Type 1 funding – eligibility criteria:
- The Lead Applicant (LA) has overall responsibility for the project and leads the evaluation pilot. The LA must be employed by an organisation with capacity to support the project. This will be the Lead Organisation on the project.
- The Lead Organisation must be based in a Low and Middle Income Country (LMIC) on the OECD Development Assistance Committee (DAC) List of Official Development Assistance Recipients (DAC list). There are some exceptions: organisations in China, India, or in countries due to graduate from the DAC list are not eligible.
- If an Organisation’s headquarters is not based in an LMIC, but the organisation has a Country or Field Office in an LMIC, then this LMIC-based Country or Field Office is eligible to apply.
Not eligible to apply for this Type 1 Funding are:
- Organisations not based in an LMIC;
- Independent / freelance researchers (these can be hired by the Lead Organisation as researchers or research assistants on the pilot project, but they cannot directly apply for funding).
Type 2 funding – eligibility criteria:
- The Lead Applicant (LA) has overall responsibility for the project and leads the evaluation pilot. The LA must be employed by an organisation with capacity to support the project. This will be the Lead Organisation on the project.
- The Lead Organisation must be a US 501(c)(3) non-profit organization that can receive funding from a Donor-Advised Fund (DAF).
Not eligible to apply for this Type 2 Funding are:
- Organisations which are not a US 501(c)(3) non-profit organization;
- Independent / freelance researchers (these can be hired by the Lead Organisation as researchers or research assistants on the pilot project, but they cannot directly apply for funding).
Eligible costs
Applicants are welcome to request the funding required to support the proposed evaluation activities, within the following limits:
- Type 1 funding: £3,000 to a maximum of £4,500 GBP per project. Eligible costs will be covered at 100%.
- Type 2 funding: Up to a maximum of $4,000 USD. Eligible costs will be covered at 100%.
Examples of eligible costs include:
- Funding to cover the time that the Lead Applicant and Team Members commit to the project.
- Employment of a Research Assistant to help with the research.
- Travel and subsistence for co-creation of evaluation tools, evaluation activities, and validation activities (e.g., field trips, workshops, interviews).
- Consumables for evaluation activities (e.g., materials necessary to undertake creative reflection methods).
- Translation or interpretation in relation to the evaluation pilot.
Applicants are required to demonstrate that any resources requested are reasonable in the context of the proposed pilot project.
Assessment process
The Creating Safer Space network is committed to assessing grant applications fairly, to ensure the best projects are funded. When you have submitted your application, the Creating Safer Space administrative team will check the application to ensure it meets the stated criteria (e.g., with respect to eligibility and allowable costs).
All applications that meet the funding criteria will then be reviewed by an Assessment Panel. Each panel member will give the application a score between 1 and 6, where 1 is ‘unfundable’ and 6 is ‘exceptional’; the panel will then discuss these scores to agree on a grade and rank the applications. The final funding decision based on this ranking is made by the Executive Committee of the Creating Safer Space network. The Executive Committee will not re-rank the proposals, but it will decide how many projects the network can fund based on the funding requested, and monitor the portfolio of projects for regional spread.
Evaluation criteria
Applications will be evaluated against the following Evaluation Criteria:
- Eligibility: All applications will be checked with regard to their formal eligibility.
- Pilot Project Quality and Importance: In particular, the extent to which the proposal meets the specific aims of the Creating Safer Space network with regard to piloting the CSS Participatory Evaluation and Learning Methodology; the appropriateness of the suggested evaluation design and methods; and the quality of the participatory element of the evaluation.
- People: Including the quality of the evaluation team; the quality of the participatory element of co-creation; and evidence of the ability of the evaluation team to bring the project to completion.
- Management: Including the appropriateness of the project plan and timeframes; and the extent to which questions of security, safeguarding, ethics, and data management have been considered.
- Value for Money: The extent to which the resources requested are reasonable in the context of the project.
Timeline
Pilot projects are asked to adhere to the following timeline:
- Application deadline: 12 June 2023
- Funding decision: 1 July 2023
- Evaluations to take place: 15 July – 31 August 2023
- Project reports due: 15 September 2023