How to promote more informed policy and practice

17 June 2004 11:00 - 12:30 GMT+00
Workshop

Speakers:

Julius Court, ODI

John Young, ODI

Arcadie Barbarosie, Director, IPP

Caroline Newman, Programme Officer, LGI

Description

This workshop was designed for the directors and senior staff of policy research centers in Central and Eastern Europe and the former Soviet Union. The purpose was to improve participants' ability to analyze the context within which they work and develop strategies to improve the policy impact of their work.

Chisinau, Moldova

Introductions and Background

Arcadie Barbarosie (Director, IPP) and Caroline Newman (Programme Officer, LGI) welcomed participants to the workshop.

Julius Court gave the background to the meeting. The OSI public policy network and ODI have been working at the intersection of research and policy. More recently, they have also been trying to identify ways to increase the impact of research-based evidence on policy and practice. ODI has also been studying when, why and how evidence informs policy and practice in different contexts around the world.

John Young introduced the structure of the workshop, namely:

  • Introduction
  • Research-Policy processes in the CEE/FSU Region
  • Learn about the Context, Evidence, Links framework
  • Share experiences about approaches to strengthen research-policy links
  • Maximizing policy influence: tools and approaches
  • Conclusions and evaluation

John Young also provided a background for the discussion. Better links between researchers, policy makers and civil society groups can help save lives, reduce poverty and improve quality of life. All too often, it seems that researchers and policy makers live in parallel universes. Researchers cannot understand why there is resistance to policy change despite clear and convincing evidence. Policy makers bemoan the inability of many researchers to make their findings accessible, digestible and in time for policy discussions.

We define both research and policy very broadly. By research we do not just mean classical scientific research. It includes any systematic learning process - from theory building and data collection, to evaluation and action research. Similarly, policy is not just narrowly defined as a set of policy documents or legislation; it is about setting a deliberate course of action and then implementing it. It includes the setting of policy agendas, official policy documents, legislation, changes in patterns of government spending to implement policies, and the whole process of implementation. It is also about what happens on the ground: a policy is worth nothing unless it results in actual change. These are all relevant if we want to try to make policy more evidence-based and see the results of our research adopted in policy and practice.

There is a vast amount of existing theory on this subject. But most of it is from developed, OECD countries and there is very little systematic research on the interface of research and policy in transition and developing countries. This is a serious problem given the massive diversity of cultural, economic, and political contexts. Furthermore, international actors have an exaggerated impact on research and policy processes. This makes it difficult to draw valid generalizations and lessons from existing experience and theory.

However, some of the theory does seem particularly relevant for developing countries. Roe identifies the importance of policy narratives. Policy makers are strongly influenced by very simple stories such as the 'tragedy of the commons'. Many of these simple stories are wrong, but they are nevertheless very attractive and powerful. Lipsky points out the importance of street-level bureaucrats. It is the people who implement policy who very often have the greatest impact on how that policy translates into practice. Without understanding the policy implementation process and the people involved in it, it is impossible to know how to influence it to promote better policies and practice.

Malcolm Gladwell's book 'The Tipping Point' describes how social epidemics spread. It is about the different types of people who are involved in the policy process: connectors, who know a lot of people; mavens, who hoover up and digest information; and salesmen, who are very good at 'selling' ideas. He also talks about how context affects how people behave. In an experiment in the US, researchers sent students on errands all over the campus, arranged for them to pass somebody in distress who clearly needed help, and then analyzed the factors which influenced whether the students stopped to help or not. The most important factor seemed to be whether the student was in a hurry or not. Gladwell describes how the conjunction of these factors creates the 'tipping points' when ideas suddenly spread and are adopted.

Policy making used to be widely thought of as a linear and logical process, in which policy makers identified a problem, commissioned research, took note of the results and made sensible policies which were then implemented. Clearly that is not the case. Policy making is a dynamic, complex, chaotic process. Clay and Schaffer's 1984 book 'Room for Manoeuvre' described how 'the whole life of policy is a chaos of purposes and accidents. It is not at all a matter of the rational implementation of decisions through selected strategies'. That is increasingly recognised as a more realistic description of the policy process than the linear rational model - though the truth is probably somewhere in the middle.

Furthermore, as Steve Omamo pointed out in a recent report on policy research on African agriculture: 'Most policy research on African agriculture is irrelevant to agricultural and overall economic policy in Africa'.

Vincent Cable, a Member of Parliament in the UK, outlines five S's that limit evidence-based decision-making from the side of legislators: Speed, Superficiality, Spin, Secrecy and Scientific ignorance.

  • Speed: policy makers are under chronic time pressure and are forced to process information quickly.
  • Superficiality: each policy maker has to cover vast thematic fields, and cannot possibly have in depth knowledge about every issue in those areas. They are therefore heavily dependent on the knowledge and integrity of the people who inform them.
  • Spin: in the political world, perception is very important. Perception guides political decisions.
  • Secrecy: much of the evidence that informs decisions is secret, and cannot be openly examined or debated.
  • Scientific ignorance: there is a growing suspicion towards science and scientists among the public, which will have an effect on policies.

It is not really surprising that the link between research and policy is tenuous and difficult to understand if policy processes are complex and chaotic, and much research is not very policy relevant.

Research-Policy Processes in the CEE/FSU Region

Group work and discussion focused on two key questions:

  1. What are the key factors affecting the policy impact of your Institutes' work? Some of the main points highlighted included:
  • Context Issues
    • Social and political groups (i.e. politics)
    • Views of high authorities
    • Lobby groups (e.g. oil industry)
    • Stability - of organisation and of state
    • Public opinion / prejudice
    • There is the feeling among policy makers that 'Government knows best' and is the only legitimate body to make decisions (therefore little need for think tanks)
    • Lack of capacity of government to work with think tanks - low professionalism
    • Lack of interest from bureaucrats
    • Poor education - poor analytical capacity of policy makers
    • Lack of transparency in decision making, and conservatism, are problems
    • Concept of 'public policy' not understood in some countries
    • Lack of understanding of policy research in government
  • Think Tank Related Issues and Researchers
    • Expertise and reputation of the institute
    • Financial stability of institute
    • Financial independence
    • Degree of understanding of normative policies
    • Balance of political affiliation ('political correctness')
    • Ability to communicate
    • Ability to do comparative research
    • Ability to identify 'hot issues' in advance
    • Timing
  • Links to Policy makers, Media and Donors
    • Relationship with policy makers
    • Being well-connected to government, powerful donors
    • Political affiliation - better to be non-aligned
    • Good media relations
    • Competition between different policy institutes
    • Relationship with donors
  2. What are the key factors affecting research-policy interaction in the CEE/FSU region? In addition to the points above, some of the main issues included:
  • Political instability and changeability
  • Politicians express the will of the people
  • Government not interested / able to use policy research
  • Separation between academic and policy organisations
  • Poor networking between government and think tanks
  • Level of market relations matters
  • Development of civil society matters
  • Influence of donors / EU agenda; determine research; sell ideas through think tanks
  • Policy makers 'buy' results
  • Policy makers bias the research or 'cherry pick'
  • Receptivity to external ideas - too much, too little
  • Historical lack of cooperation
  • Common programmes / issues across the region - trafficking, migration
  • Financial dependency of think tanks
  • Different priorities (orientation towards EU affects political contexts significantly)
  • Difference between research and policy priorities due to donors preferences
  • Donors sell their ideas through think tanks
  • Image / visibility
  • Discrepancy between academic and applied research

The ODI Framework and Findings: Context, Evidence and Links

As we have seen from the literature and discussion of issues in the region, the link between research and policy is tenuous and difficult to understand because policy processes are complex and much research is not very policy relevant.

ODI's Context, Evidence and Links Framework is an analytical and practical tool. The aim is to simplify the complexity of how evidence contributes to the policy process, so that policy makers and researchers can make decisions about how they do their work to maximise the chance that policies are evidence-based, and that research does have a positive impact on policy and practice. It is based on a thorough review of the literature and a wide range of case studies at international, regional and national level across the developing world.

Four broad groups of factors have been identified, the first of which we call external influences. These are the factors outside a particular country which affect policy makers and policy processes within the country. For example, in small, heavily indebted countries, World Bank and bilateral donor policies and practices can be very influential. In this region, EU accession has substantially affected policy processes and the uptake of research. At national level the factors fall into three main areas. The political context includes the people, institutions and processes involved in policy making. The evidence arena is about the type and quality of research and how it is communicated. The third arena, links, is about the mechanisms affecting how evidence does or does not get into the policy process.

The results of our research indicate that the Political Context area is probably the most important. This includes issues about governance including democracy, the openness of the policy process and media and academic freedom. Policy making is a very political process. If there is a large degree of contestation about an issue it is very difficult to make progress. By contrast, if there is a large demand, particularly spurred by a crisis where policy makers are seeking a solution, the chances are much greater that research will be used. The process of policy implementation is also a crucial area. The street-level-bureaucrats who actually implement policy can exert an enormous influence.

Despite what we said earlier, policy processes are clearly not completely chaotic. There are budget and legislative cycles which may be quite transparent, and if researchers understand and feed their results into these cycles they have a much greater chance of success. Policy windows (resulting from crises) provide another entry point. In these situations, where the decision is likely to be taken very quickly, researchers need to respond very quickly and they are not always able to do so.

In terms of the Evidence, there are a set of issues which seem to come out most clearly and which make a big difference to whether research is taken up. If you can provide a solution to a problem and are able to put it on the table, you stand a greater chance of being able to influence policy. There are a number of dimensions to this. The first is relevance: working on an issue which is topical and relevant to policy makers makes it easier for them to engage with the research. The second is operational usefulness: this is not just about producing research which is topically relevant, but about providing research which suggests how that policy maker may do something differently in his work. The third issue is credibility: this is about not only the content of the message or the approach of the research, but about who is saying it and their recognised expertise. Communication is crucial in both directions in terms of: researchers listening to policy makers; engaging policy makers in the research right from the beginning; and keeping them involved or in touch with that process.

The issue of Links, or mechanisms for bridging research and policy, is perhaps the area about which we know least, and it is a very complex one. We know that issues of trust and legitimacy, networks and working groups are important. But there are further questions about what makes a working group successful, and when it draws on research and helps to bridge the gap between research and policy. This is a challenge which we still face and on which our ongoing work will provide further insights.

Finally, there are some large External Influences which shape and affect all of these issues about how research and evidence are used. In the CEE region the issue of EU accession is providing significant incentives for these countries to call in and engage with their researchers to inform policy for joining the EU. In Africa, Poverty Reduction Strategy Papers (PRSPs) provide incentives for local policy makers to pull in information from their researchers. What donors do also matters. Donors such as OSI and DFID are trying new things, and the emphasis is shifting away from basic research towards policy research and mechanisms for bringing research into practice. The final point is that donors can have a mixed role. If large organisations such as the International Monetary Fund seek to identify the key issues themselves and impose their findings on a recipient country, there is likely to be a considerable backlash. On the other hand, donors can add legitimacy to the findings of local research which was previously considered problematic.

The framework we present here is a generic, perhaps ideal, model. In some situations there will not be much overlap between the different spheres, in other cases the overlap may vary considerably. In situations where there is little political will for change, the context and evidence spheres may not overlap at all and intermediaries are essential to bring the evidence to the notice of policy makers (the daisy-chain model). The ivory tower model describes situations where university-based academic research takes place in complete isolation from the real world outside. Furthermore, the relative importance of each of the spheres may be different in different situations, and may change over time. The framework should perhaps be viewed as a trio of floating spheres of variable size and degree of overlap.

Following the presentation there were a range of questions regarding the presentation and work of ODI. These included:

  • How to ensure credibility?
  • Funding for strategic work?
  • How much research do you need in a think tank?
  • Who are the stakeholders?
  • Sustainability of think tanks?
  • Balance of political affiliation ('political correctness')
  • What should be the priority?
  • Evaluation of impact - bottom line measurement of 'success of think tanks'
  • Balance between objectivity and subjectivity
  • What is ODI's position vis-à-vis competitors?

Using the Framework Towards an Influence Strategy

An interesting thing about the framework is how well it maps onto real-life activities. The political context sphere maps onto politics and policy making, evidence onto the processes of research, learning and thinking, and links onto networking, the media and advocacy. Even the overlapping areas map onto recognisable activities. The intersection of the political context and evidence represents the process of policy analysis - the study of how to implement specific policies and of their likely impact. The overlap between evidence and links is the process of academic discourse through publications and conferences, and the area between links and political context is the world of campaigning and lobbying. The area in the middle - the bull's-eye - where convincing evidence provides a practical solution to a current policy problem, and is supported by and brought to the attention of policy makers by actors in all three areas, is where the link between evidence and policy is likely to be most immediate.

So, if you are a researcher, policy maker or development practitioner with the desire to promote a particular policy you need to know about:

  • the external environment which might influence how people think or behave: who are the key external actors? what is their agenda? how do they influence the political context?
  • the political context you are working in: is there political interest in change? is there room for manoeuvre? how do policy makers perceive the problem?
  • the evidence you have, or could get: is there enough of it? is it convincing? is it relevant? is it practically useful? are the concepts familiar or new? does it need re-packaging?
  • and the links that exist to bring the evidence to the attention of policy makers: who are the key organisations and individuals? are there existing networks to use? what is the best way to transfer the information: face-to-face or through the media or campaigns?

For policy institutes wishing to influence policy and practice, understanding the context, evidence and links is just the first part of the process. Our case studies also identify a number of practical things that researchers need to do to influence policy and practice, and how to do them.

  • In the political context arena you need to get to know the policy makers, identify friends and foes, prepare for regular policy opportunities and look out for policy windows. One of the best ways is to work with them through commissions, and establish an approach that combines a strategic focus on current issues with the ability to respond rapidly to unexpected opportunities.
  • Make sure your evidence is credible. This has much to do with your long term reputation. Provide practical solutions to policy problems in familiar language and concepts. Action-research using pilot projects to generate legitimacy seems to be particularly powerful.
  • Make the most of the existing links by getting to know the other actors, working through existing networks, and building coalitions and partnerships. Identify the key individuals who can help. You need people who can network with others, mavens to absorb and process information, and good salesmen who can convince the sceptics. You may also need to use informal 'shadow networks' as well as more formal channels.

Influencing policy change is an art as much as a science, but there are a wide range of well known and often straightforward tools that can provide powerful insights and help to maximize your chances of impact on policy.

A Case Study: Paravets in Kenya

To give an example of using the framework and the types of issues in the Context, Evidence and Links arenas, we use a case study from Kenya. Livestock services were among the first rural services targeted for privatisation under structural adjustment programmes, particularly in Sub-Saharan Africa. The veterinary profession, however, was very slow to respond, and increasing financial constraints effectively paralysed government services in the late 1980s and early 1990s. During this period NGOs introduced a new model of community-based livestock services. The Intermediate Technology Development Group (ITDG) was one of the early pioneers in the mid 1980s, and adopted an action-research approach with a clear objective to use the results, if positive, to influence the policy environment to allow the approaches to be widely replicated. This case study explores the reasons why, despite the outstanding success of the new decentralized community-based animal health care (DAHC) approaches, it took over 15 years to convince policy makers to develop policies and legislation to allow this to happen - and why these still have not been formally adopted, despite the proliferation of community-based livestock services throughout the arid and semi-arid parts of Kenya.

The case study consists of an historical narrative leading up to the observed policy change and explores why those policy decisions and practices took place and assesses the role of research in that process.

The key events which seem to have contributed to the policy shift in Kenya were:

  1. The arrival of ITDG in 1986 with an explicit focus on developing and testing new approaches, then seeking to influence the policy environment so they can be implemented more widely.
  2. The first ITDG vets workshop in 1988, which brought together DAHC practitioners from several projects around the country, marked a significant increase in interactions between researchers/practitioners and policy makers.
  3. Dr Wamokoya's appointment as Director of Veterinary Services (DVS) in 1990, and his emphasis on veterinary professionalism, reversed an emerging interest in policy reform.
  4. The establishment of bilateral DAHC workshops strengthened the emerging network of practitioners and the links between policy makers and practitioners.
  5. The attendance of Dr Kajume, then Provincial Director of Veterinary Services for Eastern Province, at the 1993 vets workshop marked a further improvement in linkages between researchers, practitioners and policy makers.
  6. The gradual increase in the number of agencies training community-based animal health workers (CAHWs) from 1994 to 1997 further strengthened the evidence in favour of DAHC approaches, and also contributed to:
  7. The publication of a letter by the Kenya Veterinary Board (KVB) in 1998 threatening to punish livestock owners and veterinarians involved in DAH programmes in an attempt to stop what they regarded as a dangerous approach. The letter however had the opposite effect: it forced all stakeholders together into a policy network to try to find a solution to the problem. Supporters in the government used the crisis to launch a multi-stakeholder study (known widely as the Hübl study) which significantly increased the weight of evidence still further.
  8. A multi-stakeholder workshop in Meru in 1999 (based on ITDG's Vets Workshops) provided a clear signal from policy makers that they were interested in finding a solution, which improved the political climate for change still further.
  9. The political climate couldn't have been better while Julius Kajume was acting DVS in 1999, but deteriorated with the appointment of a more conservative Dr Chong in 2000.
  10. Increasing opposition to the new policies from the Kenya Veterinary Association in 2001 both undermined the policy coalition, weakening the link between researchers/practitioners and policy makers, and worsened the political climate.

The animal health care case study reaffirms much of the current theory of research-policy linkages. The policy process was influenced far more by the political context than by anything else, and personalities and personal relationships were at least as important as any formal relationships and structures.

The crisis caused by the KVB letter in 1998 was clearly the tipping point. Beforehand there was a long period where CAHW schemes gradually proliferated, generating powerful evidence of their value, and providing an issue around which different groups of stakeholders, supporters and antagonists, could form formal and informal networks. Afterwards, there was a surprisingly long process where all stakeholders came together to develop a new policy framework.

Although some of the external NGOs promoting the approach had been influenced by emerging ideas in the development discourse, formal research seems to have contributed relatively little to the policy process in Kenya, and research reports even less so (with the exception of the Hübl study). Evidence generated by working CAHW schemes, communicated directly to visitors by livestock owners and the animal health staff directly involved in them, seems to have been much more important. Early on, this evidence contributed to the rising popularity of DAH programmes with donors and field veterinarians, and in the mid 1990s, albeit second hand, to the alarm of the KVB, resulting in the publication of their letter in the national press, which brought everybody together, including the KVB itself, and resulted in a new policy shift in favour of the approach.

With the benefit of hindsight, distance and the results of this study, it is possible to suggest some changes to what was done, which might have accelerated the process. These include:

  1. Greater effort to understand the political context - the legal and policy framework, the key actors, their attitudes and influences, and other reform processes.
  2. Greater effort, earlier on, to get government staff, especially those opposed to the idea, to visit working CAHW schemes and learn about them at first hand.
  3. Effort to generate interest among non-veterinary staff, and parliamentarians.
  4. A clearer communication strategy to influence government vets and government policy.
  5. More effort to get to know the key players - the Director and Deputy Directors of Veterinary Services in Nairobi - and figure out how best to influence them.
  6. More effort to understand the policy process in Kenya - how new ideas become incorporated into policy, and new legislation enacted.

It is also clear that working with local communities to develop effective and sustainable examples of new approaches is essential to prove their effectiveness and acquire the legitimacy to advocate for change. That takes time, and the early pioneers of the approach in Kenya deserve recognition for the efforts they have made over the last 17 years.

Policy Influence: Examples and Practical Lessons from CEE/FSU

Group work focused on examples of research-policy influence from participants and the main practical lessons from them. Some examples from the groups include:

  • An example from Moldova concerning the Transdniestr conflict. Transdniestr is a de facto state. In July 2002 an approach to solve the conflict was published, but it would have made Moldova a Russian protectorate. There was pressure for this from OECD countries. Some publications were prepared. Demonstrations occurred in Moldova but these had no effect. George Soros spent 24 hours in the country. Stakeholders in Moldova highlighted their concerns and analysis, and convinced him of the need to persuade other governments not to support the planned approach. The next morning, at the American Ambassador's residence, he convinced the Americans not to support it, and said so at a press conference. This highlights the key need to identify champions.
  • It is possible to promote a policy without doing research. One example is the European Neighbourhood Policy to promote democratization in neighbouring countries. The targets were European Union policy makers and neighbouring country governments. OSI developed an action plan, a timeline of when to act, and databases of whom to target. We wrote open letters to drafters; mobilized partners with debates and brainstorming; networked in Brussels and CEE; held follow-up meetings; briefed journalists; and wrote a letter to George Soros.
  • A study on refugees in Georgia was never promoted because, although the analysis was sound, it was critical of the new government in its honeymoon period.
  • Book on economic growth being promoted by an NGO; they wanted to involve the media - but the media weren't interested in the topic.
  • OSI Brussels got a new budget line in the EC budget - it not only had to lobby for the initiative, but actually had to write the documents for the EC to facilitate the shift.
  • Work to introduce private pensions drew on lessons from other countries; a law was adopted in 2000, but then an election occurred and the new government withdrew the law.
  • It is often the case that you work with policy makers (in this case the EC) to try to change policy, but then staff move and you have to start again.

Some specific practical approaches that the group highlighted include:

  • Be strategic - choose right cases
  • Strategic stakeholder planning
  • Identify champions
  • Top politicians' attention is crucial; media (TV) can serve as an alternative
  • Do it in practice - provide practical help to policy makers
  • Timing
  • Seize political opportunities - elections etc (doesn't always work)
  • Communication (to different levels of decision maker)
  • Communication - if media not interested - education for journalists
  • Networking
  • Finding funding
  • Credibility and legitimacy do matter

Some Practical Tools

ODI work has suggested that think tanks need to be able to:

  • Understand the political context
  • Do credible research
  • Communicate effectively
  • Work with others

This has implications for organizational capacities and strategies concerning staff, processes and funding.

We've already seen how ODI's RAPID Framework can help you to understand the policy area you are working in. There are many other types of tools that are useful. Other useful tools to help to understand the policy context include: Stakeholder Analysis; Forcefield Analysis; Writeshops; Policy Mapping; and Political Context Mapping. This is vital in terms of developing an influence strategy. There is a wide set of research tools - from case studies to action research - that can help generate new or better evidence to support your case.

The key communications questions are: Who do I want to convince? What do I want them to do? What will convince them? What relevant material do I have? A SWOT analysis can help to focus a communications strategy on the key messages and targets, and using the media can help you to reach a wide audience.

Many tools have also been developed by organisations involved in lobbying, advocacy and campaigning for pro-poor change. One example of such work is that being undertaken in the LGI Fellowships programme. Policy makers from around the region are undertaking exercises to map the policy process and preparing policy papers to try to shift policy in their specific contexts. Adapted from the work of Merilee Grindle, the aim of the mapping exercise is to describe who makes decisions, and in what ways, formal and informal, policies are made. A key aspect is to analyse the different interests. The approach taken is very comprehensive, involving reviews of the literature and media, interviews, experience and focus groups. It involves a description of the processes (formal and informal), followed by assessments rating the influence of different actors. The exercises have given a good indication of where decisions are made and who the key stakeholders are. The mapping has assessed the roles and influence of groups such as government, parliament, civil society, the judiciary and the private sector. It has also assessed how roles and influence vary at different levels: local, national and international.

Force field analysis is widely used to inform decision-making, particularly in planning and implementing change management programmes in organisations. It is also a useful method for gaining a comprehensive view of the different forces (their source and strength) acting on a potential policy change, and is therefore a very powerful tool for analysing the possibilities for influencing policy. Force field analysis can clarify the 'driving forces' for change and identify obstacles or 'restraining forces' against it. For bridging research and policy, it can be used to analyse the forces affecting a situation or to assess the forces affecting whether particular research might be adopted as policy. It might also be used to identify where research may help tip forces towards a change.

How is a force field analysis carried out? The first step is to discuss and agree on the current situation and the goal of the policy or institutional change. The next step is to brainstorm the 'driving' and 'restraining' forces, listing all forces for change in one column and all forces against change in the other. The forces should then be sorted into common themes and/or prioritised according to their 'magnitude' towards change by assigning each force a score from 1 (weak) to 5 (strong). The last and most important step is to discuss action strategies to reduce the 'restraining' forces and to capitalise on the 'driving' forces.
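The scoring and comparison steps above can be sketched in a few lines of code. This is purely an illustrative example, not part of the workshop materials; the force names and scores are invented, and the 1-5 scale follows the description above:

```python
# Illustrative force field scoring sketch.
# Force names and scores (1 = weak, 5 = strong) are invented examples.

driving = {
    "Donor pressure for reform": 4,
    "New research evidence": 3,
    "Public demand for change": 2,
}
restraining = {
    "Bureaucratic inertia": 5,
    "Budget constraints": 3,
}

drive_total = sum(driving.values())
restrain_total = sum(restraining.values())

print(f"Driving forces total:     {drive_total}")
print(f"Restraining forces total: {restrain_total}")

# The action-strategy step: if restraining forces outweigh driving
# ones, focus first on reducing the strongest restraining force.
strongest = max(restraining, key=restraining.get)
print(f"Strongest restraining force: {strongest}")
```

In this invented example the driving forces total 9 against 8 restraining, so change is marginally favoured; a strategy session would then discuss how to weaken 'Bureaucratic inertia', the strongest restraining force.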

In terms of communication, there is a set of issues which come out most clearly and which make a big difference to whether research is taken up. Communication is crucial in both directions: researchers listening to policy makers, and engaging policy makers in the research right from the beginning and keeping them involved or in touch with that process. The first step is to identify who you want to influence - the audience - and, crucially, what you want them to do differently. Assess their specific information needs, preferences and channels (official / unofficial, personal / impersonal, empirical data vs stories). The second step is to clarify your messages: brevity, clarity, form and language. The third step is promotion - there are many ways, but interactive communication works best; seeing is believing; and multiple formats / media are better than one.

Simon Maxwell, Director of ODI, has developed some ideas on policy entrepreneurship. He argues that policy entrepreneurship requires a wide range of skills. Researchers who want to be good policy entrepreneurs also need to be:

  • Storytellers: Practitioners, bureaucrats and policy makers often articulate and make sense of complex realities through simple stories. Though sometimes profoundly misleading, narratives are incredibly powerful.
  • Networkers: Policy making usually takes place within communities of people who know each other and interact. If you want to influence policy makers, you need to join their networks.
  • Engineers: There is often a huge gap between what politicians and policy makers say they are doing and what actually happens on the ground. Researchers need to work not just with the senior level policy makers, but also with the 'street-level bureaucrats'.
  • Fixers: Policy making is essentially a political process. Although you don't need to be a Rasputin or Machiavelli, successful policy entrepreneurs need to know how to operate in a political environment - when to make your pitch, to whom and how.

Try ODI's Policy Entrepreneur Questionnaire to find out which you tend to favour. You may want to develop new skills in these areas, or work with others who have these skills.

Finally, in terms of managing think tanks (the topic of a good book by OSI), the Director of IPPR, Matthew Taylor, talks about three functions of a think tank, which affect whom it engages and how:

  • The solid function - to do substantial research and communicate core ideas to inform policy: the weighty research, publications, evidence, authority and independence
  • The liquid function - to facilitate the trickling-down of these ideas through government and partner institutions; policy formulation and implementation.
  • The gas function - to change awareness and attitudes in the environment; agenda setting; problem identification.

Why Network: The Functions of Policy Networks

As a lead-in to the network discussion the following day, the discussion focused on the role of networks and the different types of roles that policy networks can have. Drawing on the work of Stephen Yeo (2004), there are six main types of network function, namely:

  • Filtering - networks provide an easy means of deciding what to pay attention to;
  • Amplifying - networks take a given message and present it in ways that allow it to be understood and absorbed more quickly and easily;
  • Investor / Provider - involves the provision of resources by networks, i.e. money to carry out research;
  • Facilitator - networks provide services which make it easier to do research;
  • Convening - networks provide a way to identify and bring together 'the right group of researchers' to plan and carry out a research project (the network is perceived as 'the place to look' and 'the people to consult');
  • Communities - networks also play an important role in building and sustaining research communities (from ensuring research is done to setting standards).

The group identified the following ways the network could specifically help them:

  • Get funding - the investor / provider function
  • Convening
    • Sharing experiences and knowledge
    • Sharing success and failure experiences and cases
    • KM / knowledge sharing - lessons / contacts and training
    • Information resource
  • Personal contacts
  • New ideas from outside
  • Fresh outlook for your own business
  • Contribution to wider issues / projects - e.g. contribute to EU policy
  • Increasing legitimacy and impact of findings
  • Network research projects
  • Facilitate comparative research
  • Credibility of ideas that could be applied in the context of different countries
  • Benchmarking / peer review methodologies
  • Exchange of staff

Summary of Workshop Evaluation Findings

There are three main conclusions arising from the workshop evaluation. First, participants, in general, thought the topic was relevant and the workshop was of good quality. Secondly, most participants wanted more time to share experience on practical cases and to gather more cases from the region. Thirdly, there was particular demand for further training on policy / political assessment tools, policy influence tools and managing think tanks.
