‘Dumbing down' the audience

Enrique Mendizabal
5 July 2010

Evidence-informed policies are central to ODI's mission to link high-quality applied research with practical policy advice. That is why the Research and Policy in Development (RAPID) Programme works to better understand the relationship between research, policy and practice, and to figure out how think tanks and similar organisations can ensure strong links between them. To achieve this in developing countries we have worked from multiple angles. On the one hand, we are helping researchers to communicate their research and engage with policy processes – an effort we have reinforced by working with donor agencies to improve the way they fund and support researchers. On the other hand, we have been working with progressive policy-makers in developing countries to improve their ability to bring evidence to bear in policy discussions.


In approaching the first goal, improving the way researchers communicate with policy-makers, we originally turned to surveys to understand what policy-makers were looking for. The answer was clear: policy-makers prefer shorter and more practical summaries over longer academic papers. This is part of the logic behind the surge in briefing papers, opinion pieces, blogs and other multimedia communications from ODI and other organisations.

Our work developing the skills of researchers to prioritise messages and draw useful recommendations from their work has met with ever-increasing demand – though some researchers have criticised the process, claiming that prioritising information is akin to ‘dumbing down' or oversimplifying their work.

Responding to policy-makers' needs is important, and being able to clarify and communicate research is an essential skill for development researchers. But it may have unintended consequences. By always giving policy-makers what they want – shorter, simpler and easier things to read – are we implicitly accepting that they should not be held to the same standards as other professionals? In short, are we unintentionally ‘dumbing down' the audience?

Professionals (like journalists, architects, managers and doctors) need credible knowledge to carry out their work. Successful professionals do not just check the latest blogs or browse the most recent tweets; instead they study academic publications and trade magazines, and attend professional conferences and continuing education courses to update their expert knowledge. As a consequence of this demand, they are showered with a wide range of specialised publications and support services that, rather than simplify and digest things for them, intellectually stimulate and challenge them.

This effort reflects the unavoidable fact that they are all dealing with complex problems. Caring for a patient, finding the right balance between the aesthetics and structure of a building, or estimating the right level of growth or economic risk for a country are not simple tasks. The scale of the problems faced requires these professionals to engage with their complexity and work with it. To deal with this, we have developed models and frameworks that help us recognise and understand our environment and decide what to do. However, as Cleaver and Franks found when attempting to communicate a framework for water governance, some policy-makers consider these far too complicated and unnecessary; what they want, what they need, are solutions – three messages or action points.

Somehow, we have come to accept that policy-makers in the development sector (and I include policy-makers of developing and developed countries in this group) don't need to engage with the complexity of the problems they face and that it is enough for them to know what to do. If they can muster an action plan, that is sufficient.

This is not just dangerous policy-making in the short term. In the long term, by separating research from the influencing process, we may be giving policy-makers incentives against investing in their own capacity. If they can always expect a two-page briefing with simple steps to follow, why should they ever bother reading a full study and getting to the bottom of the arguments? Why check the data used in the analysis, or the methodological robustness of the analysis itself? And if they don't have to, why bother learning how to do it in the first place?

If policy-makers are responsible for making policy decisions, and if we want these to be research-based, then it is in our best interest that they be capable of challenging the evidence – any evidence presented to them. We know that bad science leads to bad policy – but bad science has little chance of getting through the critical filters of competent policy-makers.

Think tanks and other research organisations don't exist only to do research and directly seek policy changes: they are also responsible, whether they like it or not, for developing future generations of policy-makers, promoting the debate of new ideas, and supporting the environments where these debates happen. If communicating in ever simpler and flashier terms goes as far as removing all engagement with the research itself (the definition of the problem, the methods, models, frameworks, etc.), then we are no different from any other interest group that influences policy on the basis of beliefs or allegiances.

What is worse, we might be de-skilling ourselves – losing the competencies required for long-term research, academic writing and sustained policy engagement. This is a particularly serious challenge for cash-strapped organisations in many developing countries.

We should hold policy-makers, and anyone involved in using research to promote, make and implement decisions that affect the lives of the poorest people in the world, to the highest standards; we should always expect them to be experts in their field – or at least to surround themselves with competent advisors (who should certainly never be satisfied with a two-pager on issues critical to their work). In contexts where the capacity to make decisions is absent, and where the audience cannot engage with even the most basic nuances of the research on which recommendations are based, our objectives should focus on changing that situation. As policy research institutes, perhaps we should aim at improving education systems, research capacity and civil service reform, rather than advocating the adoption of yet another brilliant idea.

