Since 2003, ODI’s Research and Policy in Development (RAPID) programme has worked hard to improve the integration of local knowledge and research-based evidence into policy and practice. To mark 15 years of research and action on evidence-based decision-making, RAPID co-founders John Young and Simon Maxwell discuss what’s changed since the programme was set up, and what new challenges and opportunities are on the horizon.
How has the broader context which shapes evidence use changed to improve or impede the production and uptake of research-based evidence?
For me, there are two key contextual issues affecting the production and use of evidence currently. On the one hand there is the increasing availability of information and an increasing number of sources of that information. Anybody can publish anything now, it can be easily accessible and people can use it as they will – that’s a very dramatic change.
On the other, I think there has been a shift in the policy-making community, and certainly in the political community, back to ideologically based policy-making, with less interest in evidence-based policy-making. There's at least as much policy-based evidence-making as there is evidence-based policy-making.
But it’s not all negative. There are institutions and organisations being put in place to try and ensure that those making decisions about international development do use evidence. For example the Independent Commission for Aid Impact (ICAI), which was set up in 2011, does an excellent job of evaluating investments made by the UK Government’s Department for International Development (DFID).
There also seems to be increasing demand for convincing evidence of the value of aid, particularly from governments wanting to maintain current levels of investment in aid. And let's not forget the growing emphasis on the usefulness and use of academic research beyond the academic sphere. The introduction of the Research Excellence Framework (REF) in 2014 is helping to shape the research that is done in universities, so there is now more 'think tank-like', policy-oriented research going on in UK academia than there was.
It has always been the case that policy is highly contested, but it does feel at the moment that we're in a bit of a battle between the populists on one hand and the neoliberals on the other. Our job is to carry the flag for common sense: to say 'there is a middle way, there are policy options, here's the research', and, to researchers, 'here's how you can sell it'.
I'd also add, to the changes John has identified, the shift towards the 'results agenda'. ODI has done some really important work on the results agenda in aid, showing it can be a straitjacket: ministers are keen to demonstrate results to their electorates, so civil servants require projects to be written in such a way that the results are very clear.
But we know that development is uncertain and conflictual, and that you might not arrive at the point you thought you would. We know you can have really good projects which don't deliver the results originally foreseen.
What have researchers and research organisations done to improve evidence use, and what still needs attention?
I think many organisations are much better at horizon-scanning and identifying what’s likely to be coming up, and shaping their research to address that, although this is fantastically difficult if you’re contract-driven. So that’s one dimension.
Another dimension is an increasing recognition that you don't do the research and then think about how to communicate it; you have to communicate throughout the whole process of research, and involve stakeholders throughout.
In RAPID we’ve found Outcome Mapping-based techniques a very powerful tool for encouraging researchers to do this. It helps them think through questions such as ‘what is the big problem we’re trying to solve here?’, ‘who are the key stakeholders?’, ‘what kind of evidence will help to persuade that person/those people?’ And people love doing it!
On the other hand, one of the things I worry about at night is the ethical question of being a think tank in this changing context, where fact is less powerful than fiction and appealing to people's feelings is more effective than appealing to their intellect.
How far should we go down the road of actually using the results of our research to generate materials designed to appeal to that emotional side of people? And how much should we insist on the scientific principle that we must have the evidence to support the claims we make, and explain why we think this evidence is valid and the approach rigorous? I don't know what the answer to that is.
Back when RAPID was first established, we identified four essential skills for working in a think tank. These were: the ability to tell a good story; being well connected (so people would hear your story); being practical (so that you gave people sensible solutions); and thinking politically (so that you provided solutions at the right time and in the right form).
We also agreed that relationship-building is critical. Not just exchanging business cards with ministers, but knowing enough about their problems that you can be useful, and so they trust you not to betray confidences.
The four ‘essentials’ remain really important today, timeliness in particular. The speed of research communication has definitely increased in recent years – you can’t wait for an academic journal to referee your paper and publish it and then use it, because you need to be on Twitter the same afternoon. But in this world of ‘fake news’ and huge political pressures that fifth, extra ‘essential’ – the ability to build relationships, build trust – has become even more central.
What role has collaboration (between researchers, or organisations) played in improving the use of evidence?
You can imagine that if you’re a policy-maker, one of the problems you face is that there’s a cacophony of voices. For every paper written, there are ten other papers, all of which say different things about the same problem. By working together and collaborating, we can at least simplify a little bit.
I currently chair the European Think Tanks Group (ETTG), five think tanks (ODI is one) working together on European development cooperation policy. Each think tank has its own work, and we respect each other's independence, autonomy and different views, but we try to collaborate in forging a consensus. Because we work together, publish papers together and co-host events together, we carry more weight and get better access to senior policy-makers.
In this context, an interesting challenge for development researchers is that so many of the problems we deal with in development are actually global issues. For example, agricultural innovations in one country – such as crop-planting and weed control done entirely by drones, as recently tested in the UK – influence prices and production possibilities in other countries. So if you’re working with smallholder farmers in Ethiopia you need to know what’s happening in the UK, otherwise you’re going to give the wrong advice.
Another question, thinking about collaboration, is how much do you engage with the public and with activists? I wonder whether, in this age of social media and, in a way, higher levels of citizen engagement, this is a challenge or an opportunity for researchers and research organisations.
Collaboration between research organisations has played a huge role in allowing research to be scaled up from the local to the global. When I first joined ODI it was the heyday of the large European donor-funded collaborative research programmes working on a global scale: many projects at field level, but done in collaboration with local organisations, and as a programme of research rather than as individual research projects. Sadly I think that's declining, with some exceptions, notably the Global Challenges Research Fund (GCRF). Many of the European research donors no longer fund these big research programmes.
However as Simon remarks in more detail, we are seeing more independent collaboration between organisations, for example the Think Tank 20 Group that accompanies the G20 process, and the emergence of partnerships across southern organisations, like the Southern Voice initiative, a network of think tanks across Africa, Latin America and Asia.
How has research funding changed, and how has this affected the production and use of research-based evidence?
The way that donors fund research has definitely changed, and I think the best example is the Global Challenges Research Fund (GCRF), which very explicitly retains that high-level focus on 'what do we need to know in order to be able to contribute to solving these problems'. It works up from the detailed research towards the big picture, while retaining a very strong emphasis on the quality of the research done, which means you've got to define the research questions very clearly.
You've also got to commit to continuous interaction between researchers and policy-makers, so that they collaborate to find a solution to a common problem rather than researchers suddenly having to provide evidence to answer a question they had not been aware of beforehand.
I've also been quite impressed recently by research council funding. They are very transparent about the methods they use to select research projects. That said, I think there's a structural problem with this kind of funding, in that the quality of the research, rather than the overall impact of the programme, is the primary requirement. So you get a whole load of research on a particular topic and nothing on others.
In my experience, people don’t want to fund individual institutions. What they do want to fund is solving a problem. So if you can go to somebody and say ‘malaria? We can crack that! Together we can crack that!’ you’ve got a better chance. What I really like about the new ODI strategy is that it focuses very much on the problems we can solve together.
Also, as I’ve mentioned in an answer to one of the other questions, a key challenge for development research is how you iterate between the general and the specific, so it’s helpful if research funders allow you to contextualise your work and to make connections. Increasingly, foundations can offer that kind of flexibility, but of course the risk is that some foundations are ‘good guys’ and some are potentially ‘bad guys’.
What you don't want is lots of really well-funded think tanks with the wrong views but a lot of influence, and I do fear sometimes that that is happening. So there's an important corollary to this: think tanks need funding, and think tank collaboration needs funding, but the funders also need to be transparent about who they are funding, and accountable for what they're doing.