By Dan James, INTRAC Senior Research Consultant.
The case for building a better evidence base for interventions in poverty reduction, health, education, humanitarian work and other arenas has been persuasively argued over the last few years (albeit with legitimate pushback against the more gung-ho advocates of what is often perceived as a donor-driven results agenda). Evidence, particularly of effectiveness and impact, is now a focus of major aid donors, both public and private. A recent ALNAP webinar also illustrated how some of the larger international NGOs (INGOs), such as Oxfam and IRC, are attempting to bring evidence systematically into their programming.
However, less thought seems to have been given to the kinds of evidence civil society organisations (CSOs), often smaller and working from the bottom-up, are working with and how it can inform better development practice. One reason is that ‘evidence’ is very often conceived of in the context of government policy or large-scale programmes; hence the term ‘evidence-based policy’. This is perhaps why CSOs and NGOs are sometimes perceived as lagging behind the drive for better evidence-based policy and practice. Is this the case? Why should NGOs look to strengthen their use of evidence, and how can this be achieved in practice?
CSOs and NGOs are incredible hubs of knowledge and evidence of certain kinds. They are often the actors closest to the ‘ground’ and many invest significant resources in finding out what people’s needs and priorities are. This deep knowledge of the contexts in which they work means they are working with evidence. But it is often evidence of the relevance (in the DAC terminology) of development programmes to intended beneficiaries and contexts, rather than the effectiveness of interventions. Both dimensions are critical to good development practice.
CSOs and NGOs also play a particular role in the development of new approaches, often in the early stages of an innovation process – identifying problems, generating ideas, and prototyping and piloting a diverse range of solutions. Thinking about CSOs and NGOs as innovators highlights the need to engage with evidence and knowledge. In theory, this deep knowledge of contexts and lessons from experimenting with different approaches could be used by the bigger, slower-moving institutions working at larger scales. However, without understanding what has already been tried, organisations may find themselves reinventing the wheel, or repeating past mistakes rather than innovating.
Work by INTRAC back in 2013 suggested that civil society organisations were struggling to engage with research and evidence. There is little to suggest that, beyond the biggest players, this has changed. So how can CSOs and NGOs engage more with evidence? A new book, Negotiating Knowledge, explores in more detail how NGOs are working with evidence, how they understand evidence, and the power dynamics of evidence production. Below, I illustrate the four propositions advanced in the concluding chapter with some examples from INTRAC’s recent work and experiences.
First, civil society organisations themselves need to define what evidence is needed and why. For smaller organisations, simply reviewing the evidence that they are currently using and producing is a critical first step. Standards of evidence, such as those developed by BOND or NESTA, can support this. However, we should remember that the evidence debate is not neutral, and it is often the voices of donors and policy-makers in the Global North that determine what is expected. Instead, civil society organisations need to engage on their own terms. INTRAC is seeing more organisations that are doing this. One example is how ActionAid is seeking to engage with the Value for Money agenda: focusing on what is useful to their decision-making and how that aligns with organisational values, rather than simply on what is proposed by donors.
Second, make space to assess the value attached to evidence, knowledge and learning. Some organisations, particularly the larger INGOs, have well-developed frameworks articulating the value of evidence and learning. In practice, however, many organisations are very focused on implementation, and carving out time to assess evidence and learn from programmes (and from others!) can be difficult. INTRAC itself has struggled with this in the past. This year we have explicitly budgeted for learning and outreach activities as an organisation, allocating staff time and resources to them.
Third, invest in skills and capacity for engaging with evidence and knowledge. This might be done by building capacity across a wider staff group, but also by bringing in specialist skills. As a provider of specialist skills, such as monitoring and evaluation, INTRAC is frequently involved in the latter. But we also see growing demand for more generalist staff within organisations to understand and engage with evidence, particularly as the complexity of programmes and the demands from donors and upstream partners for evidence grow. If evidence generation is always outsourced, there is a risk that it becomes a silo, separated from the “doers” in an organisation. This is one of the reasons we are offering training to develop skills in using evidence across many aspects of organisations’ work.
Fourth, there needs to be greater openness to alternative views of evidence and knowledge. My colleague Anne Garbutt describes how participatory monitoring and evaluation has fallen out of fashion over the last decade (in part because of a perceived lack of rigour), only to be rediscovered in the form of beneficiary feedback. We are not alone in arguing that there is no simple answer as to what counts as good evidence. Ultimately, if the debate becomes too focused on using certain forms of evidence, we may reduce the space for learning rather than expand it.
We see these propositions as informing our work, particularly in developing the capacity of CSOs to use evidence. Yet there are still many questions to answer about the distinctive needs of NGOs and CSOs regarding evidence. These include what evidence is appropriate for what kinds of intervention, whether the push for evidence is too orientated toward larger organisations, and whether the balance between generating evidence and using it is right.
A final question concerns the language we use to discuss this: the term ‘evidence’ itself does not always adequately cover the issues of rigour, learning, innovation, the building of business cases, and demonstrating results.
We hope to open up further discussion of these questions over the coming months.