
Does Oxfam Help?

11 Dec 2012

Oxfam is placing itself at the forefront of the push towards evidence-based charity. This is a very encouraging sign from one of the world's largest non-governmental organisations.


In general, one problem that charity evaluators like us encounter is that not enough studies have been done to measure the effect of different projects. The main reason cited by charities for this is that properly conducted Randomised Controlled Trials (RCTs) are prohibitively expensive. We are regularly asked what our view is of Oxfam, and have previously had to say that we just don't know, because not enough information is collected to assess their impact. This is why we are very glad to see Oxfam releasing data from its new Effectiveness Review.

The Effectiveness Review

In September 2011, Dr Karl Hughes published a blog post asking how charities could demonstrate they were being effective without bankrupting themselves. This year Oxfam answered his question by publishing the first results from a new "Effectiveness Review". Oxfam randomly selected a small group of interventions and studied them in depth to get a sense of how effective "typical" Oxfam work is.

Oxfam is currently running 362 "mature" projects. Of these they selected 26, spread across all the categories of intervention Oxfam is involved in: 12 community-based projects were analysed quantitatively with RCTs, 9 policy interventions were analysed using qualitative methods, and 5 humanitarian projects were analysed in terms of how well they kept to recognised quality standards. In addition, there are plans for 3 pilot studies looking into the accountability of projects.

Karl Hughes described the conclusions of the Effectiveness Review as "mixed". It is a testament to Oxfam's commitment to transparency that they have published even the unfavourable results. The general conclusion is that most Oxfam interventions make things better, but only on some measures. With further study, it is reasonable to hope that Oxfam could dramatically improve their interventions, focusing on projects (like a flood preparedness programme in Pakistan) that have very strong evidence of effectiveness. Furthermore, they might stop altogether projects with low or even negative impact, freeing up resources for those projects that work.

Room for improvement

The Effectiveness Review was an initial attempt, and it leaves plenty of room for improvement. For instance, Oxfam conducted RCTs on only 12 of the 26 projects they studied, which suggests there is room to expand the use of this "gold standard of evidence" much further.

The Effectiveness Review also failed to address the key question that Giving What We Can seeks to answer: how much good can we do, and at what cost? This is a very important point that we hope Oxfam's next Effectiveness Review will consider in detail. However, getting evidence of a positive effect at all is a very important first step.

Reasons for hope

There is a lot of good in this report. At the very least, it demonstrates that on average Oxfam has a positive effect. But more than this, it tells Oxfam where to focus its efforts. Oxfam now needs to look at the very effective Pakistan disaster risk-management programme and try to replicate its success elsewhere. They also need to look carefully at one of the livelihood projects studied and try to understand why the control group ended up better off than the intervention group.

More generally, this report demonstrates that organisations like Oxfam can take evidence seriously without compromising their work on the ground. It is estimated that each study cost only £10,000, excluding staff time. This is very little compared with the roughly quarter of a billion pounds that the organisation spends each year. Hopefully this first attempt will inspire Oxfam and other large charities to investigate their effectiveness more systematically. If these studies help Oxfam use its money even 1% more effectively, they will have paid for themselves many times over.
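
As a rough sanity check on that claim, the calculation below uses only the round numbers quoted in this post (26 studies at roughly £10,000 each, against annual spending of about £250 million). The figures are approximations for illustration, not Oxfam's own accounting.

```python
# Back-of-the-envelope check of the payback claim, using the round
# numbers quoted in this post (approximations, not audited figures).
cost_per_study = 10_000        # pounds per study, excluding staff time
num_studies = 26
annual_spend = 250_000_000     # pounds, "a quarter of a billion" a year

review_cost = cost_per_study * num_studies    # about £260,000 in total
gain_from_1_percent = 0.01 * annual_spend     # about £2,500,000 per year

print(f"Total review cost: £{review_cost:,}")
print(f"Value of a 1% effectiveness gain: £{gain_from_1_percent:,.0f} per year")
print(f"Payback ratio: {gain_from_1_percent / review_cost:.1f}x")
```

On these assumptions, a single year of a 1% improvement repays the whole review roughly ten times over.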

This review also provides good examples of how to get useful evidence despite difficulties on the ground. In an RCT, it is important for the control and intervention groups to be as similar as possible, but this is very hard to achieve in practice. For instance, in the case of the Pakistan flood risk-management project there were statistically significant differences in average age, population and distance to market between the two groups of villages. However, Oxfam demonstrated that this need not make the results worthless: using linear regression and other statistical techniques, the report attempted to correct for these discrepancies.
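
To illustrate what that kind of regression adjustment looks like, here is a minimal sketch in Python using statsmodels. The data, column names and outcome measure are hypothetical stand-ins, not figures from the Oxfam report; the point is simply that the treatment effect is estimated after controlling for the baseline covariates that differed between the two groups of villages.

```python
# A minimal sketch of regression adjustment for baseline imbalance.
# All data and column names below are hypothetical illustrations,
# not figures from the Oxfam report.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical village-level data: an outcome score, a treatment
# indicator, and the baseline covariates that differed between the
# intervention and comparison villages (age, population, distance
# to market).
df = pd.DataFrame({
    "preparedness_score": [0.62, 0.55, 0.71, 0.58, 0.40, 0.45, 0.38, 0.47],
    "treated":            [1,    1,    1,    1,    0,    0,    0,    0],
    "avg_age":            [31.0, 34.5, 29.8, 33.2, 36.2, 38.1, 35.0, 37.4],
    "population":         [820,  640,  910,  750,  1200, 1100, 980,  1050],
    "km_to_market":       [4.2,  6.1,  3.8,  5.0,  8.5,  7.9,  9.0,  7.2],
})

# Ordinary least squares with the treatment dummy plus covariates:
# the coefficient on `treated` estimates the programme effect after
# adjusting for the observed differences between the two groups.
model = smf.ols(
    "preparedness_score ~ treated + avg_age + population + km_to_market",
    data=df,
).fit()
print(model.params["treated"])   # adjusted estimate of the treatment effect
print(model.summary())
```

Of course, such adjustment only goes so far: if the groups differ too much at baseline, no amount of regression can fully rescue the comparison, which is why similar groups remain the ideal.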

This review was very preliminary, but it offers hope for the future. Key improvements would be a focus on quantified cost/benefit analysis and an expansion of the use of RCTs to a wider range of projects. But the fact that Oxfam has run and published such an extensive study, at a reasonable cost, is a milestone for effective charity. Hopefully other charities will follow their example.