Strengthening learning from research and evaluation: going with the grain

Research reports

Written by Harry Jones


The aim of this study is to put forward actionable recommendations on lesson-learning in DFID, particularly from evaluation studies and research findings.

Methodology

The study drew on four main sources of evidence:

38 semi-structured interviews, conducted in Palace Street, Abercrombie House and, by video conference or telephone, with staff in country offices;

an online survey carried out by EVD, answered by 254 respondents;

a review of documents, obtained on a rolling basis, to provide further insights, comparisons and triangulation; and

analysis and iteration, in particular with EVD staff, IACDI members and other DFID experts.

Findings and conclusions

Three perspectives on learning were identified in the study:

First, from the starting point of DFID’s research and evaluation outputs, the question of whether lessons are learned focuses on how influential that work is, and on whether findings and recommendations are taken up in policy and programming and acted upon.

Second, from the point of view of decision-making and action, the question of lesson learning becomes a matter of looking at the extent to which evidence (and in particular, that emerging from DFID’s research and evaluation) feeds into and informs the process of policy making and programming.

Third, looking at learning from the perspective of DFID as an organisation, the question of lesson learning focuses on how knowledge within DFID is captured, shared and used, as and where it is needed.

The study suggests that DFID is much better (or at least more comfortable) at using the findings of research and evaluation than at organisational learning. Similarly, it is much better at using research and evaluation findings during, or as part of, the project cycle than in more complex and emergent decision-making processes.

From the analysis of decision-making models and the role that evaluation- and research-based evidence plays, three main conclusions emerge:

1.    Initiatives that promote a sense of ownership of research and evaluations, and those that support the development and strengthening of interpersonal learning networks, work well in DFID. In other words, learning in DFID (of the kind that promotes the incorporation of analysis into decision making and the development of a learning organisation) works best with fewer intermediaries and more direct relations between users and producers of knowledge.

2.    Formal mechanisms directed at lesson-learning seem to be more useful where it is possible to ‘go with the grain’ of what is required for learning in the circumstances faced; and

3.    In line with this, DFID’s systems are not properly set up to deal with the complexity of the problems the organisation faces.

Recommendations

One very clear need identified by this study is for a strong and coherent approach to improving lesson-learning, in particular at the organisational level. The study has shown the need for serious efforts to systematise, join up and coordinate lesson-learning. At the moment, a number of departments have ongoing initiatives related to lesson-learning, but these do not appear to add up to more than the sum of their parts, and individual elements cannot be presumed to provide the impetus on their own. In addition, the time seems ripe for such a move, as the new ministerial team undertakes initiatives to improve evidence-informed decision making and the use of evaluations.

On a more practical level, DFID should aim to develop more formal and long-term relationships with key think tanks and research centres, in the UK and globally, to provide high-quality short- and long-term research- and evaluation-based lessons to DFID and its staff. Formal relations and planned spaces for communicating preliminary and final findings would reduce the cost of advocating and communicating from outside, and would facilitate feedback to ensure that research agendas remain aligned with DFID’s current and future interests and needs.

There are also a number of recommendations relating to various parts of DFID[1]; these would be most effective as part of a joined-up, coherent effort, but would also have an impact on their own.

Strengthen the research and research uptake teams to act more as experts or matchmakers between researchers and policymakers, rather than focusing their attention on synthesis and dissemination and the support of intermediary portals and projects.

Target research funding more effectively and provide clearer signals to researchers (outside DFID) about the specific current and future information needs of the organisation at the global, regional and national levels.

Focus efforts and resources on improving the communication of research outputs and findings through mechanisms that promote and strengthen professional relationships between researchers and policymakers, such as holding regular seminars and events and developing bespoke templates for the publication and dissemination of findings.

Finally, continue to promote the use of funds available for quick policy research within policy teams, and consider encouraging DFID staff to take part in the studies.

For evaluations, similar approaches should be pursued but particular attention should be given to clarifying the evaluation function within DFID.

Additionally, direct support should be provided to DFID teams and staff undertaking evaluations, including a review of terms of reference (TORs) to ensure that recommendations are taken forward both by the evaluators and by those responsible within the organisation.

To improve knowledge and information management as a whole, DFID could create an online platform that effectively makes the links between projects, questions and people. For example, it could allow one to search for DFID staff who have worked in a particular country for a specified period, or list all of the advisers who have worked or currently work on governance projects in a given region. Clear, searchable statements of current responsibilities (as well as the immediate projects a staff member is working on) would allow researchers to target information more effectively at those who may need it, when they need it.

The issue of poor institutional memory in the context of staff turnover also needs to be addressed. Ongoing plans for improving handover procedures and institutionalising exit interviews are likely to be hugely important. A possible way to counter the effects of high staff turnover would be to establish closer relations with the research teams of the Foreign and Commonwealth Office’s Directorate for Strategy Policy Planning and Analysis, which specialise in specific geographical areas and themes and therefore maintain the organisation’s institutional memory (and intelligence).

Finally, human resources policies are needed to strengthen the advisory cadre. Given their importance for learning, policies on the placement of advisers should be given attention, and efforts made to remedy what is seen as an excessively high turnover relative to the requirements of doing a good job.

Additionally, efforts should be made to ‘raise the bar’ of expertise among advisers (who are the main interlocutors between policy and research) in research skills, academic qualifications or experience, and evidence use. This is related to efforts to reduce the range of issues or interventions managed by a single adviser, so that they can focus on learning rather than simply using evidence.

Harry Jones and Enrique Mendizabal