4. What interventions, strategies and tools can strengthen capacity for EIPM?

The policy-making process is complex, with many barriers and facilitators that affect the use of evidence. This chapter maps some of these barriers and facilitators and provides tools and descriptions of initiatives that address many of the barriers. The tools described below assist in improving each of the six core skills for policy-makers described in chapter 3. These tools work at both an individual and an organisational level to increase the use of evidence in the policy-making process.

Multiple dimensions of capacity affect the use of evidence in policy-making. To build successful strategies, it is therefore necessary to understand the barriers and facilitators to evidence-informed policy-making. Oliver et al. (2014[1]), in a systematic review of 145 studies carried out across 59 countries, found that timely access to high-quality and relevant research, collaborations with policy-makers, and relationship and skill building with policy-makers were reported to be the most important factors influencing the use of evidence. Their research identified five categories of factors that can act as either facilitators or barriers depending on how they are managed (see Figure 4.1). The role of the various barriers and facilitators to evidence-informed policy-making will vary between contexts, and the list may not be exhaustive. The way in which specific barriers and facilitators operate, and how they interact with each other, will be context-specific.

Therefore, implementing any strategy or intervention to strengthen capacity for the use of evidence requires, as a first step, a gap analysis and scan of the barriers and facilitators that prevail in a given national context.

Given the complexity of policy-making and the myriad barriers that stand in the way of using evidence, a range of interventions, strategies and tools is necessary. No single strategy can be identified as superior to others in building the capacity of policy-makers to use evidence; rather, a combination of strategies that is responsive to the dynamic contextual conditions of each jurisdiction is likely to be most effective in promoting sustainable change.

Although most capacity building initiatives are multifaceted, involving more than one strategy and aiming at more than one outcome, it is nevertheless possible to draw broad distinctions between different types of initiatives and to map these onto the skills framework presented in the previous chapter, which was jointly designed by the OECD and the JRC. A summary of this mapping is presented in Figure 4.2. A more detailed mapping of interventions, strategies and tools onto the skills framework for EIPM is presented in Annex A.

Beyond these specific initiatives, it is also critical to understand how to embed these approaches within organisational structures and systems. This can be done through systemic or organisational approaches, which are analysed in the following chapter.

A key first step in any capacity building programme is to promote understanding. This includes understanding whether there is a desire for change and finding out which of the core processes of policy-making support or hinder an evidence-informed approach. Without such an understanding, government agencies risk investing in strategies that are poorly matched to their needs, wasting opportunities to enhance their use of evidence. The tools described below attempt to gauge the existing capability, motivation and opportunity to use evidence within the system.

There is great potential for these tools to be used more widely to provide a formal needs analysis of current capacity for EIPM, which then facilitates consultative approaches to choosing a range of interventions tailored to local needs and context. For example, a recent review of interventions found that, whilst many interventions were described as ‘tailored’, only a minority had actually used formal needs analysis to shape the intervention to meet local needs (Haynes et al., 2018[2]).

Australia has developed a number of tools to measure policy-makers’ capacity to engage with and use research. Seeking, Engaging with and Evaluating Research (SEER) is a tool to measure an individual policy-maker’s capacity to engage with and use research (Brennan et al., 2017[3]). SEER is envisaged as a practical tool that can help policy agencies that want to assess and develop their capacity for using research, as well as a tool to evaluate the success of initiatives designed to improve evidence use in policy-making.

SEER uses a questionnaire consisting of 50 questions, broken into three categories of assessment, to identify areas for improvement in the use of research. The first category, Capacity, measures whether an individual has the motivation and capability to engage with research and researchers. The second, Research Engagement Actions, measures the systematic process for engaging with research, including actions that are likely to be precursors to the use of research. The third, Research Use, measures the extent and way in which research is used to inform the different stages of policy or programme development. Policy agencies can use this tool by having their policy-makers fill out the questionnaire and can then determine which areas to focus on to improve the use of evidence. For more details on each category see Figure 4.3.

The Staff Assessment of Engagement with Evidence (SAGE) is a further tool developed in Australia. It aims to provide a thorough evaluation of current levels of research engagement and use, and also serves to inform interventions to improve research capacity and to evaluate their success (Makkar et al., 2016[4]; Makkar et al., 2017[5]). SAGE combines an interview and document analysis to concretely assess how policy-makers engaged with research, how research was used and what barriers affected the use of research in relation to a specific policy product. Following promising preliminary testing of the tool’s reliability and validity (Makkar et al., 2017[5]), it would be possible to train agency staff to use SAGE to assess research use within their agencies. This would help to inform the type of tools, systems and structures agencies could invest in to improve how staff use evidence, and also to evaluate the effectiveness of these tools, systems and structures.

Policy-makers cannot use research evidence if they do not know about it (Haynes et al., 2018[2]). Therefore, strategies that increase access to clearly presented research findings are a promising approach to increasing research use. These strategies include: providing access to research through online databases; disseminating tailored syntheses of research evidence; commissioning research and reviews; seminars to present research findings; and access to knowledge brokers. Mapping these interventions onto the OECD/JRC framework, they all share a core focus on improving policy-makers’ ability to obtain evidence. In terms of the behaviour change model, these interventions are generally trying to increase opportunities for policy-makers to use evidence.

Evaluations of such initiatives show that, in general, tailored and contextualised syntheses, seminars and advice from knowledge brokers and researchers seem to be the most promising means of improving access to research. Overall, research suggests that improved access alone does not significantly improve evidence-informed policy-making (Haynes et al., 2018[2]; Langer, Tripney and Gough, 2016[6]; Dobbins et al., 2009[7]). In contrast, interventions facilitating access have been found to be effective when the intervention simultaneously tries to enhance policy-makers’ opportunity and motivation to use evidence (Langer, Tripney and Gough, 2016[6]; Haynes et al., 2018[2]).

Providing policy-makers with access to research articles or syntheses via an online database aims to maximise access to specific types of research and increase policy-makers’ confidence in accessing and using such content.

The Cochrane Library and the Campbell Collaboration, both established around two decades ago, aim to improve the quality of policy-making through the production and use of systematic reviews. Cochrane is a global independent network of researchers, professionals, patients, carers and people interested in health; the Cochrane Library contains systematic reviews of medical and healthcare interventions. The Campbell Collaboration promotes positive social and economic change through the production and use of systematic reviews and other evidence synthesis for evidence-informed policy and practice. The Campbell Library covers the following areas: crime and justice, disability, education, international development, knowledge translation and implementation, nutrition and social welfare.

One approach to increasing policy-makers’ motivation to use evidence is to tailor it to their needs. In general, when evidence was synthesised, tailored for specific users and sent directly to them, the resulting ease of access and use facilitated its uptake by policy-makers (Haynes et al., 2018[2]). An example of this is the contextualised and individualised evidence brief (Langer, Tripney and Gough, 2016[6]; Haynes et al., 2018[2]).

The WHO launched a programme to support evidence-informed policy-making in a number of low- and middle-income countries (Shroff et al., 2015[8]). In Argentina, the programme focused on producing policy briefs on health research and holding policy dialogues. The OECD’s work on knowledge brokering institutions also shows that policy briefs are common tools for disseminating research to policy-makers (OECD, 2018[9]). For example, the UK What Works Centres, including the Education Endowment Foundation, the Early Intervention Foundation and the What Works Centre for Local Economic Growth, produce a range of policy briefs to disseminate key messages to their target audiences.

Tailored research products are perceived as credible, useful and likely to impact decision-making and seem to add value over and above simply providing access to the primary research.

While many organisations produce policy briefs to disseminate their research, practices are also influenced by social media. Policy briefs can be complemented or replaced by “information nuggets” and elements of storytelling, disseminated through social media accounts to spread the main messages of key policy and evaluation reports. While this may limit the substantive content of what is actually disseminated, it is also believed to be a way to increase impact in a wider sense. If citizens are aware of the results and their implications, this will also build pressure on policy-makers to pay attention to the results and ensure that they feed into policy-making.

Commissioning research and reviews of research is hypothesised to increase policy-makers’ engagement with and control over the research, which in turn increases its relevance and applicability to policy-making (Haynes et al., 2018[2]).

In Australia, the ‘Evidence Check’ programme was developed to assist Australian policy-makers in commissioning high-quality reviews of research to inform policy decisions. The programme involved an iterative process of knowledge brokering to formulate and refine the scope of, and questions addressed by, the review (Campbell et al., 2011[10]). In the UK, the Department of Health and Social Care developed a ‘Policy Reviews Facility’ to support national policy development and implementation, which has existed in some form since 1995 (EPPI Centre, 2016[11]). Policy teams, government analysts and academic experts from three universities (University College London, University of York and the London School of Hygiene and Tropical Medicine) work closely together to determine the focus of systematic review products to best meet the needs of policy work. In the US, the Office of Management and Budget (OMB) has developed grant review and support structures to assess the quality of evidence being commissioned.

Commissioned research facilitated by a knowledge broker has been found to be useful and accurate by the policy-makers who commissioned it. Such reviews were mostly used in indirect ways, such as informing policy deliberations and providing background information (Haynes et al., 2018[2]). Feedback from policy-makers and researchers on the Evidence Check suggested that the use of knowledge brokers enhanced the value of the reviews commissioned (Campbell et al., 2011[10]).

Seminars can provide policy-relevant, accessible content, the success of which seems to be enhanced by the credibility and communication skills of the presenter. It is also hypothesised that such meetings can be a useful way of bringing policy-makers and researchers together, breaking the ice in a way that could lead to further interactions.

The Joint Research Centre of the European Commission has held a lunchtime science lecture series for a number of years. The seminars feature JRC scientists and researchers, as well as external guest speakers. They are web-streamed and announced via a carousel on the JRC’s homepage and its Twitter account (EU Science Hub, 2019[12]).

Seminars are generally well received by attendees and preferred to reading reports. However, results are less positive concerning the ability of seminars in isolation to lead to behaviour change and impact on policy-making (Haynes et al., 2018[2]; Langer, Tripney and Gough, 2016[6]).

Knowledge brokers can help to facilitate policy-makers’ access to research evidence by helping them to navigate research material that may be unfamiliar. They can also help to articulate policy-makers’ needs, constraints and expectations, translating them for researchers who may be unfamiliar with the policy process. Factors that facilitate the success of knowledge brokers include their interpersonal skills, ability to provide individualised support and perceived neutrality (Haynes et al., 2018[2]).

Knowledge brokers can include individual professionals and dedicated organisations. Government Chief Science Advisors are one example of individual knowledge brokers present in some countries. In terms of institutions, some are specifically connected to knowledge producers, such as brokering units within academic institutions (Kauffeld-Monz and Fritsch, 2013[13]). Examples of such organisations are the Centre for Evaluation and Analysis of Public Policies in Poland and the Top Institute of Evidence-Based Education Research in the Netherlands. Other approaches locate the knowledge broker function closer to decision-makers, either in a body at arm’s length from government or within a relevant agency itself. Examples of this approach include activities carried out by the Australian Institute of Family Studies (AIFS) and the Research and Evaluation Unit of the Department of Children and Youth Affairs in Ireland. In France, many of these knowledge brokerage functions are integrated within the ministries, with the analytical units in the Ministries of Labour (DARES), Social Affairs (DREES) and the Environment (DEEE) providing strategic advice and access to evidence, integrating the knowledge broker functions within the day-to-day activities of the ministries.

Evaluations of knowledge brokering activities have found that such brokering is regarded as helpful and preferable to training and other tools, although not all studies have found an added value of knowledge brokering compared to other activities. Other studies have shown that working with a knowledge broker can increase policy-makers’ knowledge and skills in finding, appraising and using evidence, leading to increased engagement in evidence-informed policy-making (Haynes et al., 2018[2]).

Stimulating demand for research requires significant behaviour change from individuals working in policy-making, and these changes are unlikely to be achieved in a single training workshop, especially if the workshop is delivered in a didactic manner (Newman, Fisher and Shaxson, 2012[14]). Nevertheless, this does not mean that there is no role for training initiatives in building capacity for EIPM. Although the goals of different initiatives vary, many focus specifically on improving individual capability to use research.

Existing country practice reveals a wide range of approaches to skills development interventions. This includes both training designed to encourage managers, such as the Senior Civil Service, to become champions of research use, and more intensive skills training programmes for policy professionals. Senior Civil Service leadership training is primarily aimed at increasing managers’ understanding of EIPM, enabling them to become champions for evidence use. Intensive skills training programmes vary in content and format but can be focused on interrogating and assessing evidence as well as on using and applying it in policy-making.

Factors that lead to successful training include participant-driven learning, active input and a strengths-based approach that motivates policy-makers and assures them of their abilities. It is also critical that there is strong leadership support for attendance at such training, which is indicative of the wider organisational commitment to the use of research. Shorter, more intensive programmes have been associated with increased retention. Some studies have also found that training can be undermined by high staff turnover and conflicting work pressures. There can also be a trade-off between the intensity of training and policy-makers’ ability to attend, especially for senior policy-makers, who have a critical role to play in championing change (Haynes et al., 2018[2]).

The OECD’s work on how to engage public employees for a high-performing civil service highlights the importance of learning and training in a modern civil service, enabling civil servants to continually update their skills and capacity to innovate (OECD, 2016[15]). There is a strong justification for investment in learning and training, and there is also a strong call from employers and employees to invest in skill and competency development. Therefore, rigorous evaluation of initiatives is critical in order to invest in the most cost-efficient interventions without compromising their effectiveness.

Training managers, such as the Senior Civil Service (SCS), in research evidence use can create a shift in work culture and increase the use of evidence within their teams. Through training, managers can learn to foster an environment that enables and promotes the use of evidence in policy-making.

In Canada, the Executive Training in Research Application (EXTRA) programme provides support and development for leaders in using research. The programme is targeted at leaders in the healthcare field. Its objectives are that, after completing the training, participants will be able to use evidence in their policy-making, train their co-workers and bring about organisational change. Finland has also developed public sector leadership training, which is described in Box 4.1.

The Portuguese government also recognises the importance of the Senior Civil Service’s role in maintaining a focus on performance and results. On an annual basis, all public service organisations inform the National Institute of Public Administration of employees’ training needs, which then feeds into the development of an annual training programme (OECD, 2016[17]). Alongside this, the government has identified new competencies for the public management of complex policy challenges. This includes a focus on performance and results, innovation, communication as well as core management and leadership skills.

In Mexico, academic institutions and non-governmental organisations have been instrumental in promoting a culture of evidence-informed policy-making by developing the next generation of the senior civil service as champions of an evidence-informed approach. As a specific example, IREFAM, a private institution offering graduate studies to mental health professionals in the state of Chihuahua, altered the content of its master’s and doctoral programmes, in collaboration with the University of Texas at Austin, to include material on evidence-based prevention interventions with a specific focus on cultural adaptation. Since 2007, over 500 master’s students have been trained in these approaches, with many now occupying leadership positions in state government, enabling them to actively promote the implementation of evidence-based prevention interventions (Parra-Cardona et al., 2018[18]).

National Schools of Government also play an important role in civil service skills and knowledge development. According to OECD research, schools are principally involved in training and professional development activities, such as organising conferences, integrity and values training, and management and leadership development. Examples of the missions and mandates of three schools can be found in Box 4.2.

Training programmes geared towards policy-makers can provide them with the necessary skills to increase the use of evidence in their work. Training programmes can be very effective when they are learner-centred and participatory, ideally embedded within long-term strategies for professional development (Newman, Fisher and Shaxson, 2012[14]).

Creating learner-focused programmes can include tailoring content to individual needs, informal information exchange and practice opportunities. Through training, policy-makers not only learn new skills but often also become more motivated to use evidence, and many become research champions who train or mentor others (Haynes et al., 2018[2]).

In the UK, the Alliance for Useful Evidence runs an Evidence Masterclass where policy-makers can learn how to use evidence in their policy work and practise their new skills through simulations. Through this programme, policy-makers are able to build their confidence in compiling, assimilating, distilling, interpreting and presenting evidence. Participants learn how to find research that is relevant to their policy question and develop their ability to assess the quality and trustworthiness of research. In Sweden, the Swedish Agency for Health Technology Assessment and Assessment of Social Services also has a training programme for evidence use in policy-making.

Mexico has also implemented capacity-building initiatives for Regulatory Impact Assessment. Regulatory Impact Analysis (RIA) is a systemic approach to critically assessing the positive and negative effects of proposed and existing regulations and non-regulatory alternatives. Conducting RIA can underpin the capacity of governments to ensure that regulations are efficient and effective. For example, training seminars were held by Mexico’s Ministry of the Economy for federal and state officials on how to draft and implement Regulatory Impact Assessments. The learning programme provided a step-by-step methodology on how to produce and analyse impact assessments in practice using guidance, case studies and advice from peer government officials, experts and the OECD (adapted from OECD (2012[20])).
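
To make the underlying logic concrete: at its core, an impact assessment compares the discounted costs and benefits of a regulatory option against a baseline, typically as a net present value. The short Python sketch below illustrates that arithmetic only; the cash flows, discount rate and time horizon are hypothetical assumptions for illustration, not figures from the Mexican programme or an official OECD method.

```python
# Minimal illustration of the net-present-value comparison at the heart of
# a Regulatory Impact Assessment. All figures are hypothetical.

def npv(flows, rate):
    """Discount a list of annual net flows (year 0 first) to present value."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

# Hypothetical regulatory option: compliance costs up front, benefits later
# (net flow per year, in millions of an arbitrary currency).
regulate = [-10.0, -2.0, 4.0, 6.0, 6.0, 6.0]
baseline = [0.0] * 6            # the "do nothing" alternative

DISCOUNT_RATE = 0.05            # illustrative social discount rate

net_benefit = npv(regulate, DISCOUNT_RATE) - npv(baseline, DISCOUNT_RATE)
print(f"Net present benefit of regulating: {net_benefit:.2f} million")
```

A positive net present benefit suggests the option improves on the baseline at the chosen discount rate; a full RIA would, of course, also weigh distributional effects, risks and non-monetised impacts.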

Poland is another country that has implemented capacity-building initiatives geared towards RIA, in the form of an Academy for Regulatory Impact Assessment initiated by the Chancellery of the Prime Minister. The basic aim of the academy is to develop the competencies and skills of the civil servants responsible for the preparation of legal acts. As part of the project, participants are offered: 1) post-graduate studies in the field of regulatory impact assessment and public consultations; 2) long-term specialist training on public consultations; and 3) continuous training in the application of analytical techniques as part of the impact assessment (regulatory impact assessments and public consultations).

Training programmes are also a key function of advisory bodies and international networks in improving the science-policy interface and contributing to evidence-informed policy-making. For instance, the International Network for Government Science Advice (INGSA) works as a collaborative platform for policy exchange, capacity building and research across diverse global science advisory organisations and national systems (see Box 4.3).

In 2018, the European Commission’s Joint Research Centre (JRC) launched a pilot initiative called “Science meets Parliaments/Science meets Regions”, involving the organisation of events in 22 member states that bring together scientists, policy-makers, businesses and civil society organisations in order to promote evidence-informed policies on specific topics of local concern. The project involves targeted studies on these topics commissioned by the authorities involved, as well as a series of training courses on EIPM skills bringing together policy-makers and suppliers of evidence (see Box 4.4).

Evaluations suggest that training workshops can be a useful starting point for developing individual capacity, so long as they are appropriately tailored and allow active input from participants (Haynes et al., 2018[2]). Whilst workshops are generally well received by participants and lead to self-reported increases in knowledge, skills and confidence, in isolation they are unlikely to lead to long-term change in practice (Taylor et al., 2004[24]; Rushmer, Hunter and Steven, 2014[25]; Haynes et al., 2018[2]).

A recent systematic review presents conclusions on the effectiveness of interventions to build skills for EIPM. The systematic review covers seven interventions in which skill development was the sole mechanism used to improve evidence use and a further fifteen multi-mechanism interventions in which skills development was one component. The following conclusions were reached regarding the impact of the skill development interventions (Langer, Tripney and Gough, 2016[6]):

  • Skills development interventions were found to be effective in increasing evidence use if both the capacity and the motivation to use evidence improved.

  • Skills development interventions built capacity in reliable ways, especially if embedded in an educational programme focused on teaching critical appraisal skills.

  • Skills development interventions increased motivation to use evidence even without explicitly targeting it.

  • Skills development was found not to be effective in multi-mechanism interventions where the educational component was diluted and only passively delivered within the combined programme.

  • Skills development was found to be effective in combination with interventions to embed evidence-informed policy-making skills into organisational processes, resulting in increased motivation and opportunity to use evidence.

Mentoring is another approach that can be used to support individual capacity building. Mentoring is hypothesised to work by giving personalised guidance in relation to ‘real-world’ application of knowledge (Haynes et al., 2018[2]; Newman, Fisher and Shaxson, 2012[14]). The success of mentoring is facilitated by a number of factors, including ensuring that it is project- and person-specific and that it enables policy-makers to develop tangible skills they can directly apply to their work. The credibility of mentors is also an important factor, which can be engendered by applied expertise and strong interpersonal skills. It is also important that participants in the mentoring process are held accountable for integrating new skills and are given the opportunity to demonstrate competence, such as by presenting their work or having it assessed (Haynes et al., 2018[2]).

Evaluations that have used mentoring as a component have found a number of effects, both in terms of process and outcomes. In general, mentoring leads to self-reported increases in skills and confidence, and participants tend to apply the skills they have learnt in practice. Mentoring can also lead to improved relationships with researchers and a strengthened culture of continuous learning. There is also evidence that mentoring may provide the greatest support to staff who are less integrated into the workforce, such as new employees who may lack confidence in using research skills (Haynes et al., 2018[2]).

Policy-makers and professionals are more likely to seek and use research obtained from trusted, familiar individuals rather than from formal sources (Oliver et al., 2015[27]; Haynes et al., 2012[28]). Therefore, the different strategies for interaction discussed in the following sections can help to build trusted relationships and increase the opportunities for research to impact policy-making. These approaches include one-off or periodic forums, various platforms for ongoing interactivity and more intensive partnership projects.

Improved engagement between policy-makers and evidence producers, especially when this is accomplished in a positive way, can act as a ‘virtuous circle’, increasing trust and confidence between the two parties and increasing capacity for shared understanding and collaboration. One-off or periodic forums are generally well received by attendees, who self-report ‘broadened knowledge’, but attendance can be uneven, with difficulties engaging senior policy-makers (Haynes et al., 2018[2]). Platforms for ongoing interactivity can help to establish more trusting and equal partnerships between researchers and policy-makers. However, whilst such activities are valued by participants, there can be poor awareness of the purpose and resources available in some of the programmes. Nevertheless, some studies have found self-reported increases in understanding of research use for policy-making.

Interventions and approaches that bring together policy-makers and researchers include one-off or periodic seminars or forums, such as roundtables, cross-sector retreats and policy dialogues (Haynes et al., 2018[2]). These approaches aim to build mutual interest, trust and respect, as well as to promote learning about each other’s values, contexts, constraints and practices.

At a European level, the Joint Research Centre of the European Commission has organised an ‘Evidence and policy summer school’ for a number of years. The summer school aims to help junior to mid-career researchers to have more impact and policy-makers to use evidence for policy solutions. The summer school focuses on the tools and approaches to inform the policy-making process through evidence.

In Australia, engagement between policy-makers and researchers has been promoted through the use of ‘Policy Roundtables’ described in Box 4.6.

Platforms for ongoing interactivity can include communities of practice, formal networks and cross-sector committees. The rationale for such initiatives is that repeated face-to-face contact permits the development of trust, respect and ease of communication. Genuine and sustained collaboration can also increase ownership and investment in the research and dissemination process (Haynes et al., 2018[2]).

More intensive platforms for ongoing activity include the Policy Liaison Initiative for improving the use of Cochrane systematic reviews (Brennan et al., 2016[31]). This involved creating an ‘Evidence-Based Policy Network’ to facilitate knowledge sharing between policy-makers and researchers, alongside seminars by national and international researchers in the field of evidence synthesis and implementation (see Box 4.7). Poland also has experience in developing platforms for interaction between researchers and policy-makers (Box 4.8).

In 2009, Preventing Violence Across the Lifespan (PreVAiL) was established as an integrated knowledge translation network to support effective partnerships between its members, as well as joint research and application in the area of family violence prevention (Kothari, Sibbald and Wathen, 2014[32]). PreVAiL is international in scope, involving 60 researchers and knowledge users from Asia, Australia, Canada, Europe, the UK and the US. Its approach includes knowledge generation, dissemination and utilisation. The majority of funding, provided by the Canadian federal government, is used to support attendance at meetings as well as knowledge translation-specific activities (Kothari, Sibbald and Wathen, 2014[32]).

Partnership projects include various schemes to bring policy-makers into contact with individual scientists, through collaboration in the development of research projects as well as ad hoc or formalised systems of parliamentary advice, where researchers are called on to provide advice.

In 2015, the UK Cabinet Office set up the ‘Cross-Government Trial Advice Panel’ in partnership with the Economic and Social Research Council. The Trial Advice Panel brings together a team of experts from academia and within the civil service to support the use of experiments in public policy (What Works Network, 2018[33]). The Panel offers the opportunity to share expertise, allowing departments with limited knowledge of evaluation to work with departments that have more, as well as with top academic experts. In so doing, the Trial Advice Panel aims to reduce the barriers that departments face in commissioning and conducting evaluations, and in using the resulting evidence to improve public policies.

The UK has also created a programme that pairs academics and Members of Parliament, described in Box 4.9. In addition, the Open Innovation Team in the Cabinet Office pairs academics with civil service teams.

The Australian Institute of Family Studies (AIFS) is the government body responsible for the delivery of high-quality, policy-relevant research on families’ wellbeing. AIFS has developed an ‘Expert Panel’ supporting practitioners in delivering high-quality services for end-users (Robinson, 2017[34]). The panel gathers experts in research, practice and evaluation, who serve as advisors and facilitators. These experts support practitioners in implementing a policy by measuring outcomes, trying new policy approaches, and conducting research and evaluations.

In Finland, the ‘Hack for Society’ initiative brings together academics, NGOs and national and local government to develop co-creative teams that work on service design, co-creation and societal trials. The goals are to strengthen the understanding of different professional roles whilst tackling complex contemporary policy challenges (SITRA, 2017[35]). The Netherlands has also developed an initiative to bring academics into partnership with policy-makers (see Box 4.10).

The US also has a range of different partnership projects in different policy areas. The National Poverty Research Center is a partnership between the US Department of Health and Human Services and the University of Wisconsin-Madison. The Center provides research, training and dissemination to inform policy and practice. The Center creates a space for extensive collaboration among researchers, policy-makers and practitioners (University of Wisconsin-Madison, 2019[37]). Another example of a partnership project is the Quality Enhancement Research Initiative, which includes a policy resource center. The Center provides timely, rigorous data analysis to the government to support the development of policy. The Center brings together stakeholders including practitioners, researchers, policy-makers, service users and the general public (U.S. Department of Veterans Affairs, 2017[38]).  

Evidence-informed policy-making is more likely to occur if organisations have a culture that promotes and values research use and if they invest in resources that facilitate staff engagement with research (Makkar et al., 2015[39]; Makkar et al., 2018[40]). Therefore, measures of organisational research use culture and capacity are needed to identify strengths and areas for improvement, and to assess the impact of capacity building initiatives. A first step in enabling organisations to increase their ability to identify, assess and use research in policy-making is to examine the existing organisational capacity to access, interpret and use research findings (Kothari et al., 2009[41]). Furthermore, having tools to assess organisational capacity helps in understanding what it is about some agencies or departments that leads them to cultivate and embrace evidence-informed policy-making (Hall and Van Ryzin, 2018[42]).

Such tools are less numerous but are designed to assess the capacity to evaluate evidence within the public sector. For example, Canada’s Evidence Literacy diagnostic tool is a self-assessment tool that helps service managers and policy organisations understand their capacity to acquire, assess, adapt and apply research (Kothari et al., 2009[41]). The tool is organised into four general areas, with a number of questions addressing performance in each area (see Box 4.11). The tool is envisaged as a catalyst for discussions about research use, thus encouraging and supporting EIPM.

In the US, a tool called the ‘Norm of Evidence and Research in Decision-making’ (NERD) has been developed, which can be used across organisational and functional settings to assess evidence-based management practices within an agency (Hall and Van Ryzin, 2018[42]). Its development was motivated by the thought that organisations seeking to use evidence in policy-making must be aware of their organisation’s norms of evidence use, and of differences that may exist across divisions, in order to be most effective. In terms of its practical application in policy organisations, NERD could be used to make staffing decisions to improve person-organisation fit (Hall and Van Ryzin, 2018[42]).

The Organisational Research Access, Culture and Leadership (ORACLe) tool is a theory-based measure of organisational capacity to engage with and use research in policy development (Makkar et al., 2015[39]). ORACLe assesses multiple dimensions of organisational capacity, including the systems, supports and tools that organisations have in place to use research, as well as the value placed on research within an organisation. It is administered as a structured interview with organisational leaders (see Box 4.12). A key advantage of the scoring system produced is that it enables organisations to identify specific areas for development and to determine their strategic importance (Makkar et al., 2017[5]).

The use of evidence is intimately linked to organisational structures and systems. Undertaking changes to improve use therefore requires reflection on where evidence advice can enter the system and how strong or well-integrated evidence structures should be (Parkhurst, 2017[43]). These considerations introduce complex human dynamics that need to be considered in the development and implementation of strategies, including organisational culture and the nature and quality of communication within the organisation (Damschroder et al., 2009[44]). This suggests that the participatory development of organisational and system level interventions may offer the best chance of success (Haynes et al., 2018[2]).

The hypothesised mechanisms of changes to organisational structures and systems are complex and manifold, cutting across the full range of skill competencies identified in the OECD/JRC framework. For example, organisational systems not only serve the delivery of organisations’ routine practices but also signal their values. Workforce development can provide further opportunities and incentives for staff, which can be motivating. The creation of in-house research roles and other resources also signals managerial commitment to research use. Although the different organisational improvements discussed in this chapter have different purposes, they can all help to embed research use and drive a culture of evidence use in policy organisations. In terms of the COM-B model, these organisational factors can be understood as providing the incentives that motivate the individual to use evidence (or not). Organisational factors also enhance or constrain opportunities for individuals to use evidence. A summary of the organisational initiatives is presented in Box 4.13, with full details of the initiatives in the Appendix.

A range of organisational tools, resources and processes have been implemented to facilitate the use of research within policy organisations. These include toolkits, knowledge management protocols, organisational strategies and evaluation frameworks, and dedicated funds for commissioning research.

The New Zealand Policy Project was launched in 2014 to improve the quality of policy advice being produced across government agencies (Washington and Mintrom, 2018[45]). It deployed policy analytic tools and frameworks to investigate current practice in policy design and to improve the quality of policy advice across the whole of government. A key aim was to ensure that policy advice was developed on the basis of the best available evidence and insights, including an understanding of ‘what works’. This included developing a ‘Policy Methods Toolbox’, a repository of policy development methods that helps policy practitioners identify and select the right approach for their policy initiative (see Box 4.15).

In Germany, the federal government began a systematic evaluation of all major regulatory instruments at the end of 2014 as part of its Programme of Work for Better Regulation. The implementation of systematic evaluations aimed to strengthen performance management by evaluating the effectiveness of programmes at achieving their intended goals. The reforms were designed to enable the government to identify what works and what does not (OECD, 2016[17]).

In Ireland, the Research and Evaluation Unit in the Department of Children and Youth Affairs developed the ‘Evidence into Policy Programme’, which aims to support governmental policy priorities through research and knowledge transfer activities that promote the uptake and use of evidence to drive policy change (Box 4.14).

In the UK, the Department for Environment, Food and Rural Affairs (DEFRA) has published a number of iterations of an Evidence Investment Strategy, which aims to embed an evidence-informed strategy across the department and the wider sector (Shaxson, 2019[46]; DEFRA, 2010[47]). The Evidence Investment Strategy sets out DEFRA’s priorities for sourcing evidence, the aims of its evidence work and the evidence needs across the organisation, and describes a framework that DEFRA uses to allocate its evidence resources. Underpinning this is a strategy to retain capabilities such as infrastructure, networks, staff and expertise, and data to enable the department to respond to emergencies, alongside cross-cutting capabilities.

In Canada, the province of Ontario has been at the forefront of developing organisational initiatives to improve the quality of evidence-informed policy-making in the field of public health. One of these initiatives is described in Box 4.16.

The OECD is developing a comprehensive view of the institutionalisation and use of policy evaluation across OECD countries, based on a survey of policy evaluation (Beuselinck et al., 2018[50]; OECD, 2018[51]). The institutionalisation of policy evaluation includes the enactment and implementation of regulations and evaluation policies, as well as the specific institutional arrangements within the government (Jacob, Speer and Furubo, 2015[52]). Given that the underpinning rationale for policy evaluation differs among countries, so does the approach to institutionalisation. Some countries, such as France and Switzerland, have the use of evaluations embedded in their constitutions. Other countries, such as Austria, Germany and the United States, have framed evaluation as part of larger public management reforms. Furthermore, several countries have recently introduced - or are currently in the process of introducing - important changes to their institutional set-up and/or underpinning legal and policy framework, some of which are described in the sections that follow.

The OECD’s Observatory of Public Sector Innovation (OPSI) has created a Toolkit Navigator that contains a wide variety of tools for public sector innovation and transformation. OPSI created this database because a plethora of free innovation toolkits and guides exist to help people identify, develop and practise necessary skills and apply new ways of reaching an outcome; the Navigator helps people easily find the tools they need. Within this database, there are many evidence-informed policy-making toolkits that address the needs of policy-makers wanting to increase their use of evidence in their work. The Quality of Administration Toolbox is an important tool developed at the European level by the Joint Research Centre (see Box 4.17).

INASP has also created a toolkit for evidence-informed policy-making. INASP is an organisation that works with a global network of partners to strengthen the capacity of individuals and institutions to produce, share and use research and knowledge in support of national development. The toolkit focuses on building skills in finding, evaluating and communicating evidence, as well as developing practical implementation plans. It is designed as a training programme that includes a trainers’ manual, handouts, activities, presentations and readings (INASP, 2018[53]). Based on work done with the South African Department of Environmental Affairs, the Overseas Development Institute (ODI) has produced guidelines and good practices for evidence-informed policy-making, designed to underpin a systematic approach to improving EIPM within a government department (Wills, 2016[54]).

In recognition of the limitations in infrastructure and capacity to support the use of evidence, some jurisdictions have launched initiatives to try and maximise the use of government’s existing assets for EIPM. This can include library facilities, research portals and clearinghouses as well as data sharing software and other methods of maximising government’s data assets.

In the US, the Foundations for Evidence-Based Policymaking Act was designed to ensure that the necessary data quality and review structures were in place to support the use of administrative data in evaluations (see Box 4.18).

The OECD has been working on projects focused on creating a data-driven culture in the public sector in order to better use data to support policy-making, and service design and delivery. These projects cover the areas of digital government, budgeting and integrity.

In the US, the government has made a commitment to open data and data governance. It has a number of initiatives to facilitate this, including the website Data.gov, which provides data to the public and features over 188 000 datasets on topics such as education, public safety, health care, energy and agriculture. To assist agencies in their open data efforts and to support the federal open data ecosystem, the administration has also built additional resources such as Project Open Data, which provides agencies with tools and best practices to make their data publicly available, and the Project Open Data Dashboard, which is used to provide the public with a quarterly evaluation of agency open data progress. With these data available, the public is able to assess the work of government agencies, compare the impact of programmes and hold government accountable (OECD, 2016[17]). These initiatives are further supported by the Foundations for Evidence-Based Policymaking Act, which assists agencies in increasing their capacity to generate and use evidence in their policy-making.
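
As a concrete illustration of how such open data can be consumed programmatically: the Data.gov catalogue runs on the open-source CKAN platform, which exposes a standard JSON search API. The Python sketch below queries that catalogue using only the standard library; the endpoint and response fields follow CKAN’s documented conventions but should be treated as an assumption to verify against the current Data.gov documentation.

```python
import json
import urllib.parse
import urllib.request

# CKAN "action" API endpoint for searching the Data.gov catalogue
# (assumed per standard CKAN conventions).
BASE_URL = "https://catalog.data.gov/api/3/action/package_search"

def search_datasets(query: str, rows: int = 5) -> list[str]:
    """Return the titles of the first `rows` datasets matching `query`."""
    url = BASE_URL + "?" + urllib.parse.urlencode({"q": query, "rows": rows})
    with urllib.request.urlopen(url) as response:
        payload = json.load(response)
    # CKAN wraps results as {"success": ..., "result": {"results": [...]}}
    return [dataset["title"] for dataset in payload["result"]["results"]]

if __name__ == "__main__":
    for title in search_datasets("education"):
        print(title)
```

This kind of programmatic access is what allows researchers and civil society to build the comparisons and accountability analyses described above.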

A number of governments across the OECD have appointed Chief Scientific Advisors to support the government in ensuring that systems are in place for managing and using scientific research. In the UK, the Government Chief Science Advisor’s (GCSA) role is to advise the Prime Minister and Cabinet on science, engineering and technology. The GCSA reports directly to the Cabinet Secretary and works closely with the Science Minister, and with other ministers and permanent secretaries across Whitehall. In addition, the majority of UK government departments also have departmental Chief Scientific Advisers (DCSAs). DCSAs work alongside the government analytical professions, ministers and the policy profession to ensure that evidence is at the core of decisions made within the department. This can include the provision of advice directly to the secretary of state and oversight of the systems for ensuring that policy-makers consider relevant evidence in policy-making (Government Office for Science, 2015[56]).

New Zealand is another OECD country that has a GCSA role alongside DCSA roles. For example, the Ministry for Social Development has a DCSA who works to improve the use of evidence in policy development and advice. Ireland and Australia both have a GCSA role as well. In Australia, the GCSA also holds the position of Executive Officer of the Commonwealth Science Council, identifying challenges and opportunities for Australia that can be addressed using science. The GCSA also advocates for Australian science internationally and is a key communicator of science to the general public, with the aim of promoting understanding of, contribution to and enjoyment of science and evidence-based thinking.

A number of OECD countries have established dedicated teams to champion the development and evaluation of new approaches to public sector delivery, whilst ensuring that the government has the skills and capacity to use the evidence that is generated. In the UK, a dedicated team within the Cabinet Office supports the government’s ‘What Works Approach’ (see Box 4.19). In the US, a dedicated Evidence Team within the Office of Management and Budget (OMB) acts as a central hub of expertise across the federal government, working with other OMB offices to set research priorities and ensure the use of appropriate evaluation methodologies in federal evaluations. The Evidence Team also works actively to ensure that findings from research and other forms of evidence are used in policy design, by developing agency capacity to generate and use evidence and by providing technical assistance and other initiatives to a wide range of federal agencies and functions. Complementing the work of OMB, the Office of Evaluation Sciences in the General Services Administration works across the federal government to support trials and impact evaluations.

The Italian government has also sought to strengthen accountability through better performance management and evaluation. The Office for the Programme of Government within the Prime Minister’s Office monitors and assesses progress on the implementation of the government programme (OECD, 2016[17]). When a monitoring exercise reveals challenges in achieving a particular goal, the Office offers support and encouragement to the relevant administration. In Austria, the Federal Performance Management Office plays a comparable role in strengthening accountability through performance management and evaluation (Box 4.20).

In Korea, the Government Performance Evaluation Committee was established in 2013 as part of the Office for Government Policy Co-ordination (OPC) inside the Prime Minister’s Secretariat. The goal of the committee is to evaluate the policies of central government agencies on an annual basis. The Government Performance Evaluation system was established by the OPC to focus government efforts on resolving issues concerning the Presidential Agenda in a timely fashion (OECD, 2016[17]).

The Japanese government has implemented a number of initiatives to improve the execution and use of policy evaluation across government (Box 4.21).

In Spain, the State Agency for the Evaluation of Public Policy and Service Quality (AEVAL) was created in 2007 as part of the Office of the State Secretary of Public Administration, to evaluate government ministries through annual reports on quality management activities. AEVAL also pooled best practices across the government into a repertoire of good practices that ministries could use as benchmarks against which to compare themselves. In the same way, AEVAL performed evaluations of public policies commissioned by the government. Following a change of government in 2017, AEVAL was disbanded, with the General Directorate of Public Governance (within the Secretary of State of Civil Service) taking ownership of the evaluation of the quality of services, while the new Institute for the Evaluation of Public Policies (IEPP) was created as a renewed government commitment to the evaluation of public policies.

The Institute for the Evaluation of Public Policies is working on the development of methodologies and good practices in evaluation by drafting guides that will support the different agencies in carrying out evaluations. In addition, this body promotes training for public employees through a training plan on evaluation and specific training actions at different levels of government. The Institute advises on the evaluability of plans and programmes during the planning stage and is in charge of the ex-post evaluation of several strategic plans.

Mexico has a similar body, the National Council for the Evaluation of Social Development Policy (CONEVAL), which is a decentralised public body of the Federal Public Administration. CONEVAL has the autonomy and technical capacity to generate objective information and evaluations of social policy, which then feed back into the policy-making process to foster better policies.

Colombia has also created a National Monitoring and Evaluation system, described in Box 4.22.

Other central initiatives have been created in response to calls for greater experimentation, whilst also often encompassing wider issues of evidence generation and use within government. In Canada, the government introduced a commitment to devote a fixed percentage of programme funds to innovation (Government of Canada, 2016[57]) as part of its overall focus on evidence-based policy-making, results and delivery (see Box 4.23). This drive is supported by an Innovation and Experimentation Team in the Treasury Board, which provides central support by ensuring the enabling factors are in place to support experimentation; by helping to build capacity; by providing practical tools and resources; and by leveraging existing platforms and reporting structures so that departments can track and share experiences and showcase success.

A further organisational strategy implemented by a number of governments has been to establish internal research support bodies, such as research units and committees. Co-location and control over in-house expertise are likely to increase the policy relevance, applicability and timeliness of evidence for decision-making. The availability of in-house research expertise also creates opportunities and incentives that may motivate policy-makers to use evidence in their work (Haynes et al., 2018[2]).

Some OECD countries have developed dedicated analytical professions to support evidence-informed policy-making. In the UK, a total of 15 000 analysts are based across government departments. These analysts belong to a number of analytical professions, including the Government Economic Service, the Government Statistical Service and the Government Social Research Service. In Ireland, the Irish Government Economic and Evaluation Service (IGEES) operates as an integrated, cross-government service, supporting better policy formulation and implementation in the civil service through economic analysis and evaluation. The aim of IGEES is to contribute to the better design and targeting of government policy and better outcomes for citizens, by building on existing analytical work and playing a lead role in policy analysis. IGEES staff are embedded in each department, adding their skill set to the varied expertise working on policy analysis and formulation, and the service supports and builds economic and evaluation capacity and consistency across the civil service (IGEES, 2014[59]).

In the US, the recent Foundations for Evidence-Based Policymaking Act requires agencies to create three new positions: Chief Evaluation Officer, Chief Statistical Official and Chief Data Officer. It also requires the creation of a new job series in the civil service for program evaluation, or the enhancement of an existing one.

In Chile, a dedicated system of technical support has been created to enable better performance management and evaluation (see Box 4.24).

References

[54] Affairs, P. and A. Institute (eds.) (2016), Guidelines and good practices for evidence-informed policymaking.

[30] Australian Primary Health Care Research Institute (2019), Conversations with APHCRI | Research School of Population Health, https://rsph.anu.edu.au/research/centres-departments/australian-primary-health-care-research-institute/conversations-with-aphcri (accessed on 23 January 2019).

[50] Beuselinck, E. et al. (2018), Institutionalisation of Policy Evaluation as Enabler for Sound Public Governance: towards an OECD Perspective.

[31] Brennan, S. et al. (2016), “Design and formative evaluation of the Policy Liaison Initiative: a long-term knowledge translation strategy to encourage and support the use of Cochrane systematic reviews for informing health policy”, Evidence & Policy: A Journal of Research, Debate and Practice, Vol. 12/1, pp. 25-52, https://doi.org/10.1332/174426415X14291899424526.

[3] Brennan, S. et al. (2017), “Development and validation of SEER (Seeking, Engaging with and Evaluating Research): a measure of policymakers’ capacity to engage with and use research”, Health Research Policy and Systems, Vol. 15/1, p. 1, https://doi.org/10.1186/s12961-016-0162-8.

[10] Campbell, D. et al. (2011), “Evidence Check: knowledge brokering to commission research reviews for policy”, Evidence & Policy: A Journal of Research, Debate and Practice, Vol. 7/1, pp. 97-107, https://doi.org/10.1332/174426411X553034.

[44] Damschroder, L. et al. (2009), “Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science”, Implementation Science, Vol. 4/1, p. 50, https://doi.org/10.1186/1748-5908-4-50.

[47] DEFRA (2010), Defra’s Evidence Investment Strategy and beyond, Department for Environment, Food and Rural Affairs, https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/69292/pb13346-eis-100126.pdf (accessed on 4 February 2019).

[48] Department of the Prime Minister and Cabinet (2017), Policy Methods Toolbox, https://dpmc.govt.nz/our-programmes/policy-project/policy-methods-toolbox-0 (accessed on 29 January 2019).

[7] Dobbins, M. et al. (2009), “A randomized controlled trial evaluating the impact of knowledge translation and exchange strategies”, Implementation Science, Vol. 4/1, p. 61, https://doi.org/10.1186/1748-5908-4-61.

[29] Dwan, K., P. McInnes and S. Mazumdar (2015), “Measuring the success of facilitated engagement between knowledge producers and users: a validated scale”, Evidence & Policy: A Journal of Research, Debate and Practice, Vol. 11/2, pp. 239-252, https://doi.org/10.1332/174426414X14165029835102.

[11] EPPI Centre (2016), Department of Health and Social Care Reviews Facility to support national policy development and implementation, https://eppi.ioe.ac.uk/cms/Default.aspx?tabid=73 (accessed on 5 February 2019).

[12] EU Science Hub (2019), Lunchtime science lectures | EU Science Hub, https://ec.europa.eu/jrc/en/lunchtime-science-lectures (accessed on 5 February 2019).

[23] EU Science Hub (2019), Science meets Parliaments in Brussels and across Europe, https://ec.europa.eu/jrc/en/science-meets-parliamentscience-meets-regions (accessed on 22 January 2020).

[55] European Commission (2017), Quality of Public Administration - A Toolbox for Practitioners, https://ec.europa.eu/social/main.jsp?catId=738&langId=en&pubId=8055&type=2&furtherPubs=no (accessed on 7 February 2019).

[57] Government of Canada (2016), Experimentation direction for Deputy Heads - December 2016 - Canada.ca, https://www.canada.ca/en/innovation-hub/services/reports-resources/experimentation-direction-deputy-heads.html (accessed on 22 January 2019).

[56] Government Office for Science (2015), Chief Scientific Advisers and their officials: an introduction, https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/426307/15-2-chief-scientific-advisers-and-officials-introduction.pdf (accessed on 28 January 2019).

[42] Hall, J. and G. Van Ryzin (2018), “A Norm of Evidence and Research in Decision-making (NERD): Scale Development, Reliability, and Validity”, Public Administration Review, https://doi.org/10.1111/puar.12995.

[28] Haynes, A. et al. (2012), “Identifying Trustworthy Experts: How Do Policymakers Find and Assess Public Health Researchers Worth Consulting or Collaborating With?”, PLoS ONE, Vol. 7/3, p. e32665, https://doi.org/10.1371/journal.pone.0032665.

[2] Haynes, A. et al. (2018), “What can we learn from interventions that aim to increase policy-makers’ capacity to use research? A realist scoping review”, Health Research Policy and Systems, Vol. 16/1, p. 31, https://doi.org/10.1186/s12961-018-0277-1.

[59] IGEES (2014), Irish Government Economic and Evaluation Service, https://igees.gov.ie/ (accessed on 28 January 2019).

[53] INASP (2018), Evidence-Informed Policy Making (EIPM) Toolkit | INASP, https://www.inasp.info/publications/evidence-informed-policy-making-eipm-toolkit (accessed on 8 February 2019).

[22] INGSA (2019), About – INGSA, https://www.ingsa.org/about/ (accessed on 7 February 2019).

[52] Jacob, S., S. Speer and J. Furubo (2015), “The institutionalization of evaluation matters: Updating the International Atlas of Evaluation 10 years later”, Evaluation, Vol. 21/1, pp. 6-31, https://doi.org/10.1177/1356389014564248.

[13] Kauffeld-Monz, M. and M. Fritsch (2013), “Who Are the Knowledge Brokers in Regional Systems of Innovation? A Multi-Actor Network Analysis”, Regional Studies, Vol. 47/5, pp. 669-685, https://doi.org/10.1080/00343401003713365.

[41] Kothari, A. et al. (2009), “Is research working for you? validating a tool to examine the capacity of health organizations to use research”, Implementation Science, Vol. 4/1, p. 46, https://doi.org/10.1186/1748-5908-4-46.

[32] Kothari, A., S. Sibbald and C. Wathen (2014), “Evaluation of partnerships in a transnational family violence prevention network using an integrated knowledge translation and exchange model: a mixed methods study”, Health Research Policy and Systems, Vol. 12/1, p. 25, https://doi.org/10.1186/1478-4505-12-25.

[6] Langer, L., J. Tripney and D. Gough (2016), The science of using science: Researching the use of research evidence in decision-making, EPPI-Centre, UCL Institute of Education, London.

[40] Makkar, S. et al. (2018), “Organisational capacity and its relationship to research use in six Australian health policy agencies”, PLoS ONE, Vol. 13/3, p. e0192528, https://doi.org/10.1371/journal.pone.0192528.

[39] Makkar, S. et al. (2015), “The development of ORACLe: a measure of an organisation’s capacity to engage in evidence-informed health policy”, Health Research Policy and Systems, Vol. 14/1, p. 4, https://doi.org/10.1186/s12961-015-0069-9.

[5] Makkar, S. et al. (2017), “Preliminary testing of the reliability and feasibility of SAGE: a system to measure and score engagement with and use of research in health policies and programs”, Implementation Science, Vol. 12/1, p. 149, https://doi.org/10.1186/s13012-017-0676-7.

[14] Newman, K., C. Fisher and L. Shaxson (2012), “Stimulating Demand for Research Evidence: What Role for Capacity-building?”, IDS Bulletin, Vol. 43/5, pp. 17-24, https://doi.org/10.1111/j.1759-5436.2012.00358.x.

[9] OECD (2018), Mapping the knowledge broker function across the OECD, OECD, Paris.

[51] OECD (2018), Survey on Policy Evaluation, OECD, Paris, https://one.oecd.org/#/document/GOV/PGC(2017)29/ANN1/REV1/en?_k=uqglaz (accessed on 5 December 2018).

[58] OECD (2018), The Innovation System of the Public Service of Canada, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264307735-en.

[19] OECD (2017), National Schools of Government: Building Civil Service Capacity, OECD Public Governance Reviews, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264268906-en.

[15] OECD (2016), Engaging Public Employees for a High-Performing Civil Service, OECD Public Governance Reviews, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264267190-en.

[17] OECD (2016), The Governance of Inclusive Growth: An Overview of Country Initiatives, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264265189-en.

[20] OECD (2012), Capacity Building Seminar on Regulatory Impact Assessment (RIA) - OECD, http://www.oecd.org/gov/regulatory-policy/riaseminar.htm (accessed on 12 February 2019).

[27] Oliver, K. et al. (2015), “Identifying public health policymakers’ sources of information: comparing survey and network analyses”, The European Journal of Public Health, Vol. 27/suppl_2, p. ckv083, https://doi.org/10.1093/eurpub/ckv083.

[1] Oliver, K. et al. (2014), “A systematic review of barriers to and facilitators of the use of evidence by policymakers”, BMC Health Services Research, Vol. 14/1, https://doi.org/10.1186/1472-6963-14-2.

[43] Parkhurst, J. (2017), The politics of evidence: From evidence-based policy to the good governance of evidence, Routledge, London, http://researchonline.lshtm.ac.uk/3298900/ (accessed on 23 November 2018).

[18] Parra-Cardona, R. et al. (2018), “Strengthening a Culture of Prevention in Low- and Middle-Income Countries: Balancing Scientific Expectations and Contextual Realities”, Prevention Science, pp. 1-11, https://doi.org/10.1007/s11121-018-0935-0.

[49] Peirson, L. et al. (2012), “Building capacity for evidence informed decision making in public health: a case study of organizational change”, BMC Public Health, Vol. 12/1, p. 137, https://doi.org/10.1186/1471-2458-12-137.

[4] Makkar, S. et al. (2016), “The development of SAGE: A tool to evaluate how policymakers’ engage with and use research in health policymaking”, Research Evaluation, Vol. 25/3, pp. 315-328, https://doi.org/10.1093/reseval/rvv044.

[34] Robinson, E. (2017), Family Matters - Issue 99 - The Expert Panel Project | Australian Institute of Family Studies, https://aifs.gov.au/publications/family-matters/issue-99/expert-panel-project (accessed on 5 September 2018).

[25] Rushmer, R., D. Hunter and A. Steven (2014), “Using interactive workshops to prompt knowledge exchange: a realist evaluation of a knowledge to action initiative”, Public Health, Vol. 128/6, pp. 552-560, https://doi.org/10.1016/J.PUHE.2014.03.012.

[46] Shaxson, L. (2019), “Uncovering the practices of evidence-informed policy-making”, Public Money & Management, Vol. 39/1, pp. 46-55, https://doi.org/10.1080/09540962.2019.1537705.

[8] Shroff, Z. et al. (2015), “Incorporating research evidence into decision-making processes: researcher and decision-maker perceptions from five low- and middle-income countries”, Health Research Policy and Systems, Vol. 13/1, p. 70, https://doi.org/10.1186/s12961-015-0059-y.

[35] SITRA (2017), Hack for Society - Sitra, https://www.sitra.fi/en/projects/hack-for-society/#what-is-it-about (accessed on 6 February 2019).

[16] SITRA (2017), Public-sector leadership training - Sitra, https://www.sitra.fi/en/projects/public-sector-leadership-training/ (accessed on 6 February 2019).

[26] Stewart, R., L. Langer and Y. Erasmus (2018), “An integrated model for increasing the use of evidence by decision-makers for improved development”, Development Southern Africa, pp. 1-16, https://doi.org/10.1080/0376835X.2018.1543579.

[24] Taylor, R. et al. (2004), “Critical appraisal skills training for health care professionals: a randomized controlled trial [ISRCTN46272378]”, BMC Medical Education, Vol. 4/1, p. 30, https://doi.org/10.1186/1472-6920-4-30.

[38] U.S. Department of Veterans Affairs (2017), Overview - Partnered Evidence-Based Policy Resource Center, https://www.peprec.research.va.gov/PEPRECRESEARCH/overview.asp (accessed on 4 March 2019).

[37] University of Wisconsin-Madison (2019), National Poverty Research Center, https://www.irp.wisc.edu/national-poverty-research-center/ (accessed on 4 March 2019).

[45] Washington, S. and M. Mintrom (2018), “Strengthening policy capability: New Zealand’s Policy Project”, Policy Design and Practice, Vol. 1/1, pp. 30-46, https://doi.org/10.1080/25741292.2018.1425086.

[36] Wehrens, R., M. Bekker and R. Bal (2010), “The construction of evidence-based local health policy through partnerships: Research infrastructure, process, and context in the Rotterdam ‘Healthy in the City’ programme”, Journal of Public Health Policy, Vol. 31/4, pp. 447-460, https://doi.org/10.1057/jphp.2010.33.

[33] What Works Network (2018), The Rise of Experimental Government: Cross-Government Trial Advice Panel Update Report, https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/753468/RiseExperimentalGovernment_Cross-GovTrialAdvicePanelUpdateReport.pdf (accessed on 24 January 2019).

[21] Wilsdon, J., M. Saner and P. Gluckman (2018), “INGSA Manifesto for 2030: Science Advice for Global Goals”, INGSA, https://doi.org/10.1057/palcomms.2016.77.

Note

1. As with any categorisation exercise, it is recognised that any one intervention could belong in more than one category.
