4. Strengthening Monitoring and Evaluation in Honduras

Monitoring and evaluation are key functions of the state in all countries. These functions are normally carried out by centre-of-government (CoG) institutions, notably the institution serving the head of government (the presidency, prime minister’s office or cabinet office) and the ministry of finance. Monitoring and evaluation help governments make better decisions, improve policy making, inform citizens about the government’s actions, and ensure accountability for the development and implementation of public policies and programmes (OECD, 2016[1]).

Monitoring and evaluation are two complementary but distinct practices, with different dynamics and goals. Monitoring consists in following up on progress in implementing public policies and programmes through the systematic collection of data on specific indicators. It provides the government, parliament and citizens with information regarding the progress and achievements of ongoing initiatives and/or the use of allocated public resources. Evaluation refers to the structured and objective assessment of the design, implementation and/or results of a planned, ongoing or completed initiative (OECD, 2021[2]). Its aim is to analyse the final effects and causes of public interventions, determine the relevance and fulfilment of their objectives, and assess dimensions such as interventions’ efficiency, effectiveness, impact and sustainability.

The monitoring set-up in Honduras is well developed, especially when compared to the country’s evaluation framework. Honduras has been carrying out monitoring activities on the implementation of its national plans (Country Vision 2010-38, Nation Plan 2010-22 and Strategic Government Plan 2018-22) and institutional plans. These activities are mainly led by the Directorate for Results-Based Management (Dirección de Gestión por Resultados, DIGER), and previously by the Secretariat of General Co-ordination of the Government (Secretaría General de Coordinación de Gobierno, SCGG), and supported by the secretariats of state through special monitoring units. As part of the monitoring set-up, Honduras developed the Presidential System for Results-Based Management (Sistema Presidencial de Gestión por Resultados, SGPR), an IT support tool developed and implemented by the SCGG to collect and store information on the follow-up and monitoring of national and institutional planning. Additionally, the SCGG developed guidelines and training courses to further develop the competencies of those carrying out monitoring activities, especially on the definition of key performance indicators. However, as also analysed in Chapter 2, the monitoring set-up in Honduras mainly served reporting and accountability goals rather than supporting the high-level decision-making process. Aiming to overcome these challenges, in 2022 the incoming government replaced the Presidential System for Results-Based Management with the Public Management System for Results and Transparency (Sistema de Gerencia Pública por Resultados y Transparencia, SIGPRET), administered by DIGER.

In terms of evaluation, despite efforts from the SCGG to develop a culture of evaluation across government, Honduras lacks a sound and robust evaluation system, both from a whole-of-government perspective and for its national and institutional plans. First, there is little awareness of the importance of evaluation and its dual objective of promoting public accountability and supporting learning processes that improve policy outcomes. Second, few evaluations have been conducted by the SCGG or by other government institutions, although the government aims to change this through SIGPRET, under which three evaluations were produced in 2022. Third, Honduras also faces challenges in using evaluation results in policy making, as there is no coherent whole-of-government approach in this area and a lack of appropriate skills and capacities to carry out evaluation. However, the SCGG developed guidelines and training courses to raise awareness of the importance of evaluation and its different approaches, and implemented actions aimed at developing a legal framework for policy evaluation.

This chapter provides an overview of monitoring and evaluation practices in Honduras, including comparisons with OECD countries’ practices. It describes the institutional framework for monitoring and evaluation, as well as the tools in place for promoting the quality and use of monitoring and evaluation results. It closes with a series of recommendations aimed at helping the Honduran Government strengthen its monitoring and evaluation culture and promote the use of evidence and results in decision and policy making.

Having a robust monitoring and evaluation system requires first and foremost the existence of a sound institutional framework for monitoring and evaluation. Such a framework can help countries to co-ordinate isolated and unplanned monitoring and evaluation efforts into more formal and systematic approaches, as well as provide incentives to ensure that these activities are effectively conducted (OECD, 2020[3]).

Although there is no one-size-fits-all approach, a solid institutional framework usually includes the following four components (OECD, 2020[4]):

  • clear and comprehensive definitions of monitoring and evaluation

  • clearly mandated institutional actors with allocated resources to oversee or carry out monitoring and evaluation activities

  • a legal or policy basis to guide and undertake monitoring and evaluation activities

  • macro-level guidance on when and how to carry out monitoring and evaluation activities.

The first component of a sound institutional framework for monitoring and evaluation consists in having clear and comprehensive definitions of those activities, which could be included in legal or policy documents. Such definitions should allow identification of the characteristics of each type of practice and clearly state the objective of carrying out monitoring and evaluation activities. According to OECD data, most OECD countries (23 out of 35) have one or several definition(s) of evaluation (OECD, 2021[2]). In some countries, this definition is embedded in a legal document, while other countries define evaluation in guidelines or manuals (Box 4.1).

In the case of Honduras, there are clear and distinct definitions for monitoring and evaluation, both embedded in government guidelines:

  • The Guide for the Formulation of Indicators defines monitoring as an "independent verification of the progress of a policy, programme or project" (Secretaría Técnica de Planificación y Cooperación Externa, 2012[5]).

  • The Methodological Guide for Design Evaluation defines evaluation as "a systemic process of observation, measurement, analysis and interpretation aimed at understanding an action, in order to reach an evaluative judgment based on evidence in relation to its design, implementation, effects, results and impacts" (Secretaría de Coordinación General de Gobierno, 2017[6]).

In addition to this general definition for evaluation, Honduras has several definitions of specific types of evaluation. Having a general definition for evaluation creates a shared understanding within the public sector of both the objective and features of evaluation, while having specific definitions corresponding to the different types of evaluation carried out throughout the policy cycle allows clarification of the different goals and methods of evaluation (Table 4.1).

Definitions of monitoring and evaluation in Honduras lack clarity on the main objectives of these practices. The definition of monitoring does not provide information on objectives. The OECD identifies the following objectives for monitoring: it is expected to identify delays and bottlenecks in ongoing programmes and policies by providing descriptive information regarding their implementation, and it facilitates planning and operational decision making by providing evidence to measure performance (OECD, 2019[7]). Monitoring can also strengthen accountability and transparency, as it encourages the continuous measurement and publication of information regarding the use of resources, the efficiency of internal processes, and the delivery of outputs and outcomes of a policy or programme (OECD, 2019[7]).

In the case of evaluation, it is important to recognise the full objectives and potential value of this activity. According to the OECD, evaluation has the potential to improve public accountability and transparency by providing citizens and stakeholders with information on the results of governments’ efforts. Moreover, it can facilitate learning by informing policy makers on the policies and programmes that were, or have the potential to be, successful and the main reasons for their success or failure (OECD, 2018[8]). In this sense, conducting evaluations could allow Honduras to pursue both accountability and learning.

Making the various objectives of monitoring and evaluation clear and communicating these objectives in a legal framework would help create a shared understanding among the main government actors and citizens of the importance and purpose of these activities. Having a clear and comprehensive definition of monitoring and evaluation in Honduras that includes information on the several objectives and advantages of these activities would also facilitate co-operation among the main government actors, both by eliminating any confusion regarding the roles of or differences between monitoring and evaluation, and by making stakeholders aware of the benefits of carrying out these exercises, in particular as they support decision-making processes.

In Honduras, several actors located at the centre of government play an important role in co-ordinating and promoting monitoring and evaluation. Different decrees establish the mandates and main responsibilities of these institutional actors in terms of monitoring and evaluation; the following institutions and their mandates correspond to the institutional set-up as of November 2021:

  • The SCGG had the mandate to define mechanisms and procedures for monitoring and evaluating the government's management results, and to provide recommendations to the President of the Republic to improve the effectiveness and impact of government’s policies and programmes (article 1 of Decree 266-2013). Most of the functions of the SCGG were assumed by DIGER, created in April 2022 by Decree PCM 05-2022.

    • The Presidential Directorate for Monitoring and Evaluation, one of the three directorates of the SCGG, had the mandate to monitor and evaluate the results of national and institutional plans, sustainable development objectives, public policies, programmes, and projects. It was also responsible for proposing and co-ordinating the annual agenda of evaluations of policies, programmes and projects and their corresponding processes (article 8 of Executive Decree PCM-025-2018).

  • The Office of Presidential Priorities, established in 2020 to enhance the delivery of high-level government priorities, was responsible for monitoring strategies, goals, objectives, and action plans of the Presidency of the Republic as well as providing recommendations for the development of and compliance with the strategic priorities and goals of the Presidency of the Republic (article 1 of Executive Decree PCM-044-2020).

  • The sectoral cabinets were responsible for monitoring and evaluating compliance with the objectives and goals of strategic, sectoral, and institutional plans. As explained in Chapter 2, sectoral cabinets, which took the form of inter-ministerial committees, were created by Decree PCM-001-2014 with the aim of enhancing government co-ordination under the guidance of the SCGG; they were dissolved in 2022. Sectoral cabinets also had the mandate to propose and follow up on the impact evaluation of priority sectoral policies and their contribution to the government's long-term objectives (article 9 of Executive Decree PCM-009-2018).

  • The Planning and Evaluation Management Units (UPEGs) within each secretariat of state complemented these actors at the centre of government (Box 4.2). The UPEGs are responsible for monitoring and evaluating their secretariat’s policies, programmes, and projects (article 31 of Decree 146-1986, General Law of Public Administration), and served as the main point of contact between the SCGG and the secretariats of state, particularly for issues related to planning, monitoring and evaluation.

  • Finally, the Secretariat of Finance is responsible for the formulation, co-ordination, execution and evaluation of the General Budget of Revenues and Expenditure (article 45, Decree 83-2004, General Budget Law). It evaluates the execution of the General Budget of Revenues and Expenditure both during and at the end of the fiscal year, using information contained in the Integrated System for Financial Administration (SIAFI). In many OECD countries, ministries of finance play an important role in promoting the monitoring and evaluation of public policies and programmes by including performance and evaluation evidence in the budget cycle. This is not the case in Honduras, as there is no strong or clear link among monitoring results, performance management, and budgeting.

Another key component of a sound institutional framework for monitoring and evaluation is the existence of a legal or policy framework to guide and undertake monitoring and evaluation activities. According to OECD data, two-thirds of OECD countries (23 out of 35) have developed a legal framework that guides evaluation, and half of OECD countries (17 out of 35) have developed a policy framework for organising evaluation across government (OECD, 2020[3]). This shows that having a legal basis for carrying out evaluation activities is a key element for the systematisation of these practices across government.

There are several paths for the institutionalisation of monitoring and evaluation practices. The need for evaluation, for instance, can be recognised at the highest level in the country’s constitution, in primary and/or secondary legislation, or it can be developed in a policy framework (Box 4.3).

In the case of Honduras, there are general references to monitoring and evaluation in primary and secondary legislation (Box 4.4). However, there are a number of issues preventing the country from promoting the use of results from these activities for decision making and building a culture of monitoring and evaluation across government in the long term, including i) the lack of provisions to ensure the use of performance-monitoring results of priority public policies and national plans in the decision-making process, and ii) the lack of a general long-term framework for monitoring and evaluation.

Regarding the lack of provisions to ensure the use of performance monitoring results of priority public policies and national plans for the decision-making process, it is necessary to start by analysing the different objectives that monitoring activities may have. Monitoring should strengthen reporting, accountability, and transparency, as information regarding the use of resources, internal management processes and outputs of initiatives is routinely measured and systematically publicised. It should also facilitate planning and operational decision making as it provides evidence to measure performance, allows identification of implementation delays, and facilitates drawing lessons from the execution of initiatives.

In the case of Honduras, monitoring is developed around the reporting, accountability and transparency objective. Indeed, there is a legal framework for monitoring the planning system, established in Decree 286 of 2009 (Box 4.4), which until November 2021 was co-ordinated from the centre of government by the SCGG and the sectoral cabinets. However, the results derived from monitoring activities – including the monitoring report of the Nation Plan – were not systematically discussed at meetings attended by those responsible for leading the decision-making process, limiting the impact that monitoring activities could have in Honduras. In this sense, as assessed in the previous chapters, Honduras lacks a framework ensuring that performance monitoring results of priority public policies and national plans are discussed at the decision-making level and analysed in terms of lessons learned, bottlenecks and implementation delays that inform the decision-making process.

Regarding the lack of a general long-term framework for monitoring and evaluation, it is necessary to consider recent reforms implemented in Honduras. Indeed, in 2020 Honduras made tangible efforts to institutionalise monitoring and evaluation practices by including, in Legislative Decree 182-2020 (the General Revenue and Expenditure Budget for fiscal year 2021), specific mandates making clear to institutional actors when and how to conduct monitoring and evaluation (Table 4.2). As a result of that legal framework, during the first semester of 2021, 11 public institutions prepared and delivered design evaluations of one of their strategic programmes.

Although these efforts are steps in the right direction towards implementing a legal framework for monitoring and evaluation and promoting a culture of evaluation across government, they have important limitations. First, the mandates included in the Legislative Decree ordered the SCGG to send a copy of the reports to a number of representatives across and outside government (e.g., the president, the Superior Court of Accounts, the National Congress). However, rather than sending copies of long and generic monitoring reports to such authorities, it is important to create communication channels and forums with decision makers around priority public policies and key government areas to ensure that performance evidence is used to inform decision making. It is also important to prepare fit-for-purpose monitoring analyses that give users quick and easy access to clear monitoring results, which can translate into better uptake of the outcomes in decision making. Second, the mandates are valid only for the short term, as they apply solely to the current fiscal year (2021). Third, and related to the second point, the mandates on monitoring and evaluation included in the General Revenue and Expenditure Budget Law are subject to changing political willingness, as their renewal depends on the political environment and a political consensus in each fiscal year.

The government of Honduras could benefit from integrating the monitoring and evaluation legal framework into the planning system/performance framework, rather than providing monitoring and evaluation mandates through the annual General Revenues and Expenditure Budget law. By setting the monitoring and evaluation legal framework and integrating it into long-term policies such as the planning system and performance framework, Honduras would ensure that there exist clear mandates for institutional actors on when and how to carry out these practices beyond the fiscal year, as well as political consensus on the importance of monitoring and evaluation activities for the country beyond electoral mandates (OECD, 2020[3]). Additionally, this integration would ensure that evidence and results from monitoring and evaluation activities are used not only for reporting and accountability purposes but also as inputs for the decision-making process.

The existence of a legal framework for monitoring and evaluation is not sufficient to sustain a robust monitoring and evaluation system. It is also important to have macro-level guidelines to support the implementation of monitoring and evaluation across government. Such guidelines generally intend to assist all those participating in the implementation of a policy in better planning, commissioning and managing its monitoring and evaluation activities. For instance, guidelines for evaluation mostly refer to the reporting of evaluation results, followed by the identification and design of evaluation approaches, quality standards for evaluations, and the use of evaluation evidence (OECD, 2020[3]). Evidence shows that the majority of OECD countries (26 out of 35) have guidelines to support the implementation of evaluation across government (OECD, 2020[3]).

Honduras has guidelines to assist public institutions in planning, implementing, and managing monitoring and evaluation, including the guidelines for the formulation and approval of public policies, and the methodological guides on design evaluation, impact evaluation and results evaluation. These guidelines are published in the evaluation repository of the SGPR and were communicated to public institutions in specific training courses designed by the SCGG working in tandem with the School for Senior Management in the Public Administration (Escuela de Alta Gerencia Pública). The existence of guidelines and manuals in Honduras shows that there is a general understanding of their importance to assist policy makers in conducting monitoring and evaluation successfully.

However, these guidelines do not cover the monitoring of some essential plans, such as the Country Vision 2010-38, Nation Plan 2010-22 and Strategic Government Plan 2018-22. A robust monitoring and evaluation system in Honduras may therefore need additional guidelines on monitoring that clearly state the actors involved, their mandates, and the timeline, tools and methodology for monitoring. Guidelines on monitoring should also clarify the articulation of the monitoring activities for the different national plans as well as the institutional plans.

Additionally, detailed manuals on evaluation practices could be developed with the involvement of sectoral stakeholders. Aware of the limited competencies to carry out evaluation within the secretariats of state, the Presidential Directorate for Monitoring and Evaluation started developing more detailed manuals on evaluation practices. This process could have benefited from the comments and suggestions of the secretariats of state, which could have provided insights into the main challenges and weaknesses of implementing the guidelines in their specific sectors. Encouraging the co-production of these more detailed manuals between the CoG institution responsible for monitoring and evaluation and representatives of the secretariats of state may also be an opportunity to raise awareness of the importance of evaluation and create a sense of ownership across government.

Moreover, article 1 of Executive Decree PCM-025-2018 specified that the Presidential Directorate for Monitoring and Evaluation of the SCGG was expected to evaluate the results of the national and institutional plans, sustainable development objectives, public policies, programmes and projects. However, these are large activities that cannot all be carried out within a single year. Conducting a proper evaluation requires time and significant resources, and – most importantly – needs to be supported by a clear methodology (OECD, 2021[2]).

To that end, Honduras has already begun to implement a more focused approach to evaluation by selecting one programme to be evaluated each year. Defining a limited number of evaluations to be carried out in a given year is considered good practice, given that proper evaluation demands time and significant resources. For instance, since 2018 the Presidential Directorate for Monitoring and Evaluation has commissioned four external evaluations of specific strategic programmes and projects, including the programmes CONVIVIENDA (With House), Con Chamba Vivís Mejor (With Work you Live Better) and Vida Mejor (Better Life).

However, the CoG institution responsible for monitoring and evaluation could further develop this focused approach by clearly defining and communicating an annual evaluation agenda and developing a specific timeline for evaluations.

Monitoring a policy, programme or project implies identifying indicators that are methodologically robust. For indicators to provide decision makers with information that can be used to define what course of action to take to achieve the intended policy objectives, they should be accompanied by information that allows for their appropriate interpretation (OECD, 2021[2]). Regardless of their type, all indicators should be presented in a way that provides the following information (a minimal sketch of such an indicator record follows the list):

  • description of the indicator: name, unit of measurement, data source and formula

  • responsibility for the indicator: institution, department, or authority responsible for gathering and reporting the data

  • frequency of data collection and update of the indicator

  • baseline that serves as a starting point to measure progress

  • target or expected result.
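To make these metadata elements concrete, the following is a minimal sketch, in Python, of how an indicator record covering all five elements might be structured. The field names and the enrolment example are illustrative assumptions for the purposes of this report, not an existing Honduran schema.

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    """Minimal metadata record for a monitoring indicator (illustrative)."""
    name: str              # description: indicator name
    unit: str              # description: unit of measurement
    data_source: str       # description: where the data come from
    formula: str           # description: how the value is calculated
    responsible_body: str  # institution or department gathering and reporting the data
    frequency: str         # how often data are collected and the indicator updated
    baseline: float        # starting point against which progress is measured
    target: float          # expected result

# Hypothetical example: an education indicator with all metadata elements filled in.
enrolment = Indicator(
    name="Net primary school enrolment rate",
    unit="% of children of official primary school age",
    data_source="Secretariat of Education administrative records",
    formula="enrolled_children / children_of_primary_age * 100",
    responsible_body="UPEG, Secretariat of Education",
    frequency="annual",
    baseline=82.0,
    target=95.0,
)
```

Recording indicators in a structured, machine-readable form of this kind also makes it straightforward to verify automatically that every indicator in a plan carries a baseline, a target and a responsible body before the plan is published.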

Honduras could still improve the indicators of its national and institutional plans by developing a mix of sound indicators that includes both process and outcome/impact indicators, allowing it to monitor the implementation of policies/programmes as well as to measure the real effect of the government’s initiatives. Indeed, process indicators and output/outcome indicators are complementary, in the sense that they allow monitoring of different objectives. Process indicators are useful and recommended for tracking the implementation of programmes and for accountability purposes, since they provide regular flows of information on the implementation of a programme/plan. Output/outcome indicators, meanwhile, are useful for improving high-level decision-making processes by providing information on whether the programme is achieving its intended effects. For example, in a school-feeding programme, the number of meals delivered is a process indicator, whereas changes in child nutrition are outcome indicators.

At the national level, indicators of Country Vision 2010-38, Nation Plan 2010-22 and the Strategic Government Plan 2018-22 fulfil practically all criteria of a sound indicator, but specific improvements could be considered:

  • The indicators in the Country Vision 2010-38 and Nation Plan 2010-22 are explicitly stated and include information on the data source, the baseline, and the target values for 2013, 2017, 2022 and 2038, as well as on the institution responsible for collecting and updating the indicator. However, indicators could be improved by clearly stating the unit of measurement and formula for their calculation.

  • The indicators in the Strategic Government Plan 2018-22 are explicitly stated and include information on the baseline and the target values for 2018, 2019, 2020, 2021 and 2022. However, indicators could be improved by explicitly stating the institution or person responsible for collecting the data and updating the information, and by including the unit of measurement and formula for their calculation.

At the institutional level, secretariats of state struggle to set key performance indicators and mainly use process indicators (Box 4.5). However, Honduras could benefit from having a mix of process indicators, calculated based on the information collected monthly in the SGPR, and outcome/impact indicators, calculated on the basis of administrative data or even ad hoc perception survey data. Additionally, institutional plans with key performance indicators should be made public and communicated to key stakeholders, promoting both transparency and accountability.

Additionally, Honduras could benefit from developing a systematic framework to link institutional indicators with the national priority goals and strategic lines of national plans. Developing performance indicators, their baselines and targets is an important stage in institutional planning and the identification of policy priorities (OECD, 2021[2]). Article 1 of Decree 266-2013 established that the SCGG was responsible for defining mechanisms and procedures to monitor and evaluate the government's management results. However, there is still no explicit or systematic framework for the design of monitoring and evaluation indicators.

Indeed, as analysed in Chapter 3, there was a lack of systematic linkage between the national plans (Country Vision 2010-38, Nation Plan 2010-22, and Strategic Government Plan 2018-22) and the institutional plans. This makes it hard for stakeholders to monitor progress in terms of national priority goals and strategic lines, or to understand how institutional plans contribute to strategic plans. While secretariats of state have identified a set of indicators in their own plans, these are not presented in a way that clearly indicates their connection with elements of the Country Vision 2010-38 (national priority goals) and the Nation Plan 2010-22 (strategic lines).

Therefore, explicitly linking each indicator to the national priority goals would be essential to clarify the monitoring structure of the national plans. This link could be made visible in the institutional planning documents, as sketched below. The exercise should be undertaken by the CoG institution responsible for monitoring and evaluation, together with the different UPEGs, to inform secretariats of state of the national priority goals and strategic lines they contribute to. Such an exercise would also benefit from linking regularly updated output indicators to the outcome-level objectives included in the Country Vision, informing the government on how the administration is performing against them.
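As a purely illustrative sketch of such a linkage, each institutional indicator could carry explicit references to the Country Vision goal, Nation Plan strategic line and Strategic Government Plan objective it contributes to. All identifiers and labels below are hypothetical, not actual Honduran plan codes.

```python
# Hypothetical linkage between one institutional indicator and the national plans.
institutional_indicator = {
    "id": "EDU-012",
    "name": "Net primary school enrolment rate",
    "secretariat": "Secretariat of Education",
    "links": {
        "country_vision_goal": "CV-G1",          # national priority goal
        "nation_plan_strategic_line": "NP-SL4",  # strategic line
        "government_plan_objective": "SGP-2.1",  # plan objective
    },
}

def goals_covered(indicators: list) -> set:
    """Return the national priority goals fed by at least one institutional indicator."""
    return {ind["links"]["country_vision_goal"] for ind in indicators}

# With such links in place, the CoG institution can immediately see which
# national priority goals lack any supporting institutional indicator.
print(goals_covered([institutional_indicator]))  # {'CV-G1'}
```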

A good monitoring and evaluation system relies on comprehensive, multi-source and high-quality data (Box 4.6) that are readily available and in a format that is easy to use as part of the evaluation process. Indeed, implementing an evidence-informed agenda implies leveraging the data that are available for analytical purposes as part of the monitoring and evaluation process (Mathot and Giannini, 2022[10]). Policy evaluation, for instance, can be hindered by the lack of adequate, easy-to-use data. In this sense, a high-quality national statistics system and up-to-date databases and registers that communicate with one another and disaggregate data at the desired level are an integral part of a robust monitoring and evaluation system.

In Honduras, there is no integrated data infrastructure to facilitate the access to or sharing of administrative data horizontally among secretariats of state, a situation that creates data silos and prevents evaluators from having access to relevant data or information for their own analytical purposes (a challenge highlighted in Chapter 3). As a result, secretariats of state that operate in similar and complementary sectors cannot easily share data and information between them, and are not necessarily aware of all the data that exist and could be used in evaluation.

Aware of these limitations, Honduras may consider implementing initiatives to avoid fragmentation and duplication of efforts across secretariats of state (e.g., each developing a separate data-sharing infrastructure) and to promote public sector integration and cohesion. To do so, Honduras may consider starting by carrying out a comprehensive data inventory that accounts for all data assets created and collected by secretariats of state, and developing a strategy to encourage systematic access to, and use of, administrative data. The United States, for example, has institutionalised and implemented a more systematic structural approach to facilitate evidence-informed policy making (Box 4.7).
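A data inventory of this kind can start from something as simple as a shared catalogue of dataset descriptions maintained jointly by the secretariats of state. The entry format below is a hypothetical minimal sketch, not a description of an existing system.

```python
# Hypothetical minimal entry for a whole-of-government data inventory.
inventory_entry = {
    "dataset": "Beneficiary registry, school feeding programme",
    "owner": "Secretariat of Education",
    "contact": "upeg@example.hn",    # illustrative contact address
    "update_frequency": "monthly",
    "coverage": "national, disaggregated by municipality",
    "access_conditions": "shared on request with other secretariats",
    "contains_personal_data": True,  # flags datasets requiring privacy safeguards
}
```

Even a flat catalogue of such entries, searchable by all secretariats, would let evaluators discover which data already exist before commissioning new collection.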

Moreover, Honduras has several information systems, including the Integrated Financial Management System (Sistema de Administración Financiera Integrada, SIAFI), the Presidential System for Results-Based Management (SGPR), and the National Public Investment System (Sistema Nacional de Inversión Pública, SNIPH), among others. To ensure that relevant data can be compared and combined across sources to support better-informed decision making and public policies, Honduras could promote interoperability across existing and new information systems within the public sector. Interoperability refers to the ability of different information systems to connect, work together and communicate with one another in a co-ordinated way. By enabling system interoperability, Honduras would ensure that information systems can communicate and share data more effectively, strengthening decision and policy making and improving monitoring and evaluation activities. Such a recommendation is aligned with the 2021 OECD Recommendation of the Council on Enhancing Access to and Sharing of Data, which recommends that Adherents “foster where appropriate the findability, accessibility, interoperability and reusability of data across organisations, including within and across the public and private sectors” (OECD, 2021[12]).
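In practice, interoperability often begins with agreeing on shared identifiers and a common exchange format, so that records held in different systems can be joined. The sketch below assumes hypothetical field names for SIAFI and SIGPRET exports; neither system's actual schema is described in this report.

```python
import json

# Hypothetical exports from two systems, keyed by a shared programme identifier.
siafi_record = {"programme_id": "PRG-034", "budget_executed": 12_500_000, "currency": "HNL"}
sigpret_record = {"programme_id": "PRG-034", "goal_progress_pct": 68.0, "period": "2022-Q3"}

def join_on_programme(financial: dict, performance: dict) -> dict:
    """Combine financial and performance records for the same programme."""
    if financial["programme_id"] != performance["programme_id"]:
        raise ValueError("Records refer to different programmes")
    return {**financial, **performance}

# A decision maker can then see budget execution and goal progress side by side.
print(json.dumps(join_on_programme(siafi_record, sigpret_record), indent=2))
```

The substantive work lies in agreeing on the shared identifier (here, programme_id) across systems; once that convention exists, the technical join is trivial.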

Additionally, although the country has a National Statistics System led by the National Institute of Statistics of Honduras, the system has been unable to generate the statistical data that secretariats of state need to carry out programme and policy evaluation. A strong national statistics system is fundamental to improve how the government collects, manages, shares and stores data to make them more useful for evidence-based policy making (Mathot and Giannini, 2022[10]).

The creation of an office for statistical analysis and evaluation of indicators within DIGER, which works with the UPEGs in establishing indicators, represents a step in the right direction. In this sense, Honduras may also consider strengthening its national statistics system by having statistical officers within the UPEGs of the secretariats of state. The role of such statistical officers would be to advise on statistical policy, techniques and procedures throughout the policy cycle and to help guarantee that data needed in the evaluation process are systematically collected from the beginning of an intervention. Statistical officers would also regularly collaborate and consult with the National Institute of Statistics of Honduras to make sure that national statistical data meet the purpose and needs of secretariats of state to conduct evaluation. Considering that the skills of statistical officers are high level and specific, Honduras could also consider developing close co-operation between the National Statistics System and academia to guarantee that the right competencies in the statistical and economic domains are being developed in future graduates.

Quality assurance mechanisms seek to ensure that the findings of an evaluation are based on an objective and defensible interpretation of the results, and relate to the original objectives of the evaluation (HM Treasury, 2020[13]). Quality control mechanisms seek to ensure that the evaluation design, planning and delivery have been properly conducted to meet predetermined quality criteria (OECD, 2021[2]). Most OECD countries (24 out of 35) have in place one or several mechanisms in order to promote the quality of evaluations through various means (OECD, 2020[3]).

In Honduras, the Presidential Directorate for Monitoring and Evaluation implemented actions to promote the quality of monitoring reports and evaluations, including by developing methodological guidelines that address both the technical quality and the good governance of monitoring and evaluation. These guidelines assist with design, impact, and results evaluation, and include advice on the formulation and approval of public policies.

Considering that existing quality assurance mechanisms focus on evaluation practices rather than performance monitoring, Honduras could issue additional guidelines to clarify the working methods and tools that will support monitoring practices across government. These guidelines could also specify quality assurance processes, in the context of the monitoring exercise, that should be applied by every secretariat of state.

Additionally, Honduras does not have quality control mechanisms for its evaluations, such as peer reviews of the evaluation product, meta-evaluations, self-evaluation tools and checklists, or audits of the evaluation function. OECD data show that quality control mechanisms are much less common than quality assurance mechanisms, with only approximately one-third of countries surveyed using them (OECD, 2020[3]). However, these mechanisms are fundamental to ensure that evaluation reports and evaluative evidence meet a high-quality standard. In this sense, Honduras could develop one or several control mechanisms among the ones presented below.

The most common quality control mechanism used by countries is the peer review process. Peer reviews consist of a panel or reference group, composed of external or internal experts, that subjects an evaluation to an analysis of its technical quality and substantive content (OECD, 2020[3]). The peer review process helps determine whether the evaluation meets adequate quality standards and therefore can be published. In Honduras, the CoG institution responsible for monitoring and evaluation could consider submitting its evaluations for peer review by experts (for instance, academics and international experts) before they are published.

Countries have also developed tools aimed either at the evaluators themselves (i.e., self-evaluation) or at the managing and/or commissioning team (e.g., quality control checklists) to help them verify whether their work meets the appropriate quality criteria (OECD, 2020[3]). Self-evaluation is a critical review of project/programme performance by the operations team in charge of the intervention. Quality control checklists, meanwhile, aim to standardise quality control practices for evaluation deliverables and as such can be useful to evaluation managers, commissioners, decision makers or other stakeholders reviewing evaluations against a set of predetermined criteria (Stufflebeam, 2001[14]). The CoG institution responsible for monitoring and evaluation could consider designing such a checklist to help the secretariats of state and itself control the quality of their work. Examples such as the Polish Ministry of Infrastructure and Development’s self-assessment checklist (Box 4.8) show how self-evaluation checklist initiatives can be implemented to foster the technical quality of evaluations.
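As an illustration of how such a checklist could be operationalised, the sketch below scores an evaluation deliverable against a few predetermined criteria. The criteria are generic examples drawn from common evaluation practice, not an adopted Honduran standard.

```python
# Generic, illustrative quality control checklist for an evaluation report.
CHECKLIST = [
    "Evaluation questions are clearly linked to the intervention's objectives",
    "Methodology and data sources are described and justified",
    "Limitations and potential biases are acknowledged",
    "Findings are supported by the evidence presented",
    "Recommendations follow logically from the findings",
]

def share_of_criteria_met(review: dict) -> float:
    """Return the share of checklist criteria a deliverable satisfies."""
    met = sum(bool(review.get(criterion)) for criterion in CHECKLIST)
    return met / len(CHECKLIST)

# Example usage: a reviewer records which criteria a draft report satisfies.
draft_review = {criterion: True for criterion in CHECKLIST}
draft_review["Recommendations follow logically from the findings"] = False
print(f"Criteria met: {share_of_criteria_met(draft_review):.0%}")  # Criteria met: 80%
```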

Finally, OECD data show that Supreme Audit Institutions (SAIs) may also take an active part in the promotion of evaluation quality (OECD, 2020[16]). SAIs may become key players in the national discourse concerning evaluation quality. Thanks to their particular expertise in performance auditing, they may give governments external insights on how to better manage performance evidence and improve the quality of their evaluation systems. Additionally, SAIs may sometimes perform evaluations themselves, including on systems for managing information and on policy evaluation systems, employing their own standards for quality. In Honduras, although the Supreme Audit Court reported that it has started to carry out performance audits to measure the impact of specific programmes – for instance, related to the Sustainable Development Goals – and audits during budget execution, its focus continues to be ex post compliance, procurement, and financial audits.

Considering that the Supreme Audit Court is just starting to conduct performance audits itself, Honduras could consider encouraging the implementation of such audits, which allow analysing the efficiency and effectiveness of key public policies and programmes. To that end, the Supreme Audit Court could consider including in its annual audit plan a minimum number of performance audits of specific policies or programmes that the government considers strategic.

To put in place a monitoring and evaluation system capable of producing credible and relevant data and analyses, the individuals carrying out these activities must have the appropriate skills, knowledge, experience, and abilities. OECD countries are aware of the crucial role of competencies in promoting quality evaluations: survey data show that 13 out of 35 OECD countries use mechanisms to support the development of competencies of evaluators (OECD, 2020[3]).

In Honduras, the Presidential Directorate for Monitoring and Evaluation implemented actions to promote the competencies of individuals carrying out monitoring and evaluation. The SCGG together with the School for Senior Management in the Public Administration developed online and face-to-face training courses aimed at helping develop the competencies to carry out monitoring and policy evaluation across secretariats of state.

However, Honduras may consider further developing appropriate competencies for monitoring. Monitoring requires sufficient resources and capacities to collect data on a regular basis, calculate indicators, analyse data, etc., all of which in turn require a critical mass of trained individuals. In Honduras, these resources were located in the UPEGs and in the Monitoring Division of the Presidential Directorate for Monitoring and Evaluation. Having units dedicated to this function in the secretariats of state is an important step in the mobilisation of resources towards monitoring activities. A further step consists in continuing to strengthen the appropriate competencies of the monitoring units within both the secretariats of state and the CoG institution responsible for monitoring and evaluation; key among these is the ability to define performance indicators that allow the collection of relevant data for different policy priorities and targets, linking performance information across related single- and multi-sector policies.

Evaluation, in turn, requires the relevant technical skills to conduct it. In Honduras, evaluation was mainly carried out by external evaluators from the private sector or academia, due to the lack of technical skills and limited personnel within the secretariats of state and the Evaluation Division of the Presidential Directorate for Monitoring and Evaluation. Additionally, there are recurring difficulties in hiring personnel with the appropriate competencies for evaluation, mainly because civil service human resource rules make it difficult and time-consuming to hire specialised staff from outside the civil service.

To strengthen the competencies for monitoring and remedy the lack of skills and personnel to conduct high-quality evaluations, it is necessary to continue implementing mechanisms that support the development of appropriate competencies for both practices. Indeed, several interviewees stressed that Honduras was facing challenges in attracting and developing, within the SCGG and secretariats of state, the competencies needed to conduct in-house monitoring and evaluation activities. In order to ensure the technical quality of the results of these activities, the CoG institution responsible for monitoring and evaluation may wish to implement several complementary mechanisms, including the following:

  • First, Honduras could continue developing and implementing the online and face-to-face training courses that the School for Senior Management in the Public Administration began offering in 2018. These courses have made it possible to train hundreds of individuals across the different secretariats of state; create the basis for conducting in-house monitoring and evaluation; and start building a coherent understanding of the monitoring and evaluation system in Honduras.

  • Second, Honduras may consider developing specific training courses that complement the existing general ones. Indeed, secretariats of state also require specific training courses that allow them to address the particular challenges that arise from the specificity of their sectors. In this sense, the CoG institution responsible for monitoring and evaluation and the School for Senior Management in the Public Administration could create evaluator-training curricula at the level of individual secretariats of state, allowing evaluators to deepen their knowledge of the evaluation of a specific policy topic relevant to their particular sector.

  • Finally, another way to develop the competencies of evaluators is to foster knowledge-sharing networks. According to OECD data, a frequently used quality assurance mechanism is the establishment of a network of evaluators for exchanging practical and technical experiences related to evaluation (OECD, 2021[2]), such as the Cross-Government Evaluation Group in the United Kingdom (Box 4.9). The CoG institution responsible for monitoring and evaluation could consider strengthening its role as an evaluation champion in Honduras by creating a network of evaluators that connects those responsible for monitoring and evaluation within the UPEGs of the different secretariats of state with academics, the private sector and the international community.

In addition, the CoG institution responsible for monitoring and evaluation could consider developing quality standards for outsourcing and commissioning policy evaluations. Currently, Honduras relies mainly on external evaluators’ competencies to conduct evaluations, due to the lack of internal competencies. Considering this, and considering that developing the appropriate competencies to conduct in-house evaluations takes time, the CoG institution responsible for monitoring and evaluation could define quality standards to be included in the terms of reference (ToRs) for outsourcing and commissioning policy evaluations to external stakeholders. The ToRs provide the guidelines for the work to be carried out during the evaluation process and therefore constitute an essential tool for quality assurance (OECD, 2020[3]). The CoG institution responsible for monitoring and evaluation could also develop additional guidelines specifying that ToRs should be drafted by the evaluation manager (OECD, 2020[3]) and should appropriately and clearly outline the purpose, objectives, criteria, and key questions of the evaluation.

One of the main goals of monitoring and evaluation is to support decision making with useful insights on public issues and evidence on the impact of policies and their underlying change mechanisms (OECD, 2020[3]). Despite the many potential users of monitoring and evaluation results, their use remains a constant challenge and often fails to meet expectations.

As of November 2021, in Honduras, monitoring of national and institutional plans took place in the following scenarios:

  • Every month, the Planning and Evaluation Management Units of the different secretariats of state reported progress in the implementation of their institutional plans and the corresponding indicators of the National Plan in the SGPR. This process was supported by the sectoral cabinets, which directly co-ordinated the reporting process of the different institutions of their corresponding sector. However, there were no discussions around policy performance.

  • Twice a year, the Presidential Directorate for Monitoring and Evaluation, with the support of the sectoral cabinets, prepared a monitoring report of the National Plan, based on the information sent by the different institutions. As part of the report, the Presidential Directorate for Monitoring and Evaluation identified the main challenges regarding the implementation process of the Plan as well as those challenges that should be considered to improve management in the immediate future.

However, as highlighted in Chapter 3, there was a disconnect between the planning system (monitoring for reporting and accountability) and the decision-making process carried out at the centre of government by the president, secretaries of state, heads of public institutions, etc. Although there was a system for monitoring presidential goals (Monitoreo de Metas Presidenciales), it was disconnected from the regular planning system (Country Vision 2010-38 and Nation Plan 2010-22), making it difficult to use monitoring results in the decision-making process beyond reporting and accountability purposes. These challenges need to be considered in the establishment of new planning, monitoring and evaluation structures.

The SCGG communicated the different results of monitoring and evaluation activities both internally and externally by sharing its monitoring and evaluation reports with internal and external stakeholders. Communication of monitoring and evaluation results included the following:

  • According to Legislative Decree 182-2020, the SCGG was expected to publish the report on the progress of the Country Vision 2010-38 and the Nation Plan 2010-22 on its website (since 2012, this report has been prepared and published every two years, not annually as initially established in the Legislative Decree). The SCGG was also expected to submit these reports to the President of the Republic, the Superior Court of Accounts, the Institute for Access to Public Information and the National Congress (through the Ordinary Budget Commission), to directly inform them on the implementation progress of the plans.

  • The SCGG prepared and published annual reports on the implementation of the corresponding Government Plan.

  • The SCGG was expected to send quarterly reports containing the synthesis results of ex ante evaluations carried out by government departments to the Deputy Coordinators of the sectoral cabinets and the head of each government department.

  • All the reports concerning the monitoring and evaluation of public policies were expected to be published by the SCGG in the Presidential System for Results-Based Management, accessible to the public and policy makers.

However, in order to promote the use of these results, evidence should not only be accessible to the public and policy makers, but also be presented in a strategic way, driven by the purposes of the monitoring and evaluation exercises as well as the needs of intended users (OECD, 2021[2]). Evidence shows that tailored and contextualised syntheses, seminars and advice from knowledge brokers and researchers seem to be the most promising means of improving access to evidence (OECD, 2020[17]).

To tailor evidence and results to different publics, the CoG institution responsible for monitoring and evaluation could consider developing a communication and dissemination strategy to adapt the way monitoring reports and evaluation findings are presented to their potential users (policy makers, civil servants, high-level decision makers, National Congress, citizens, academia, etc.). Such a strategy could include the use of infographics, tailored syntheses of evidence (e.g. in the form of policy briefs or executive summaries), seminars to present the main findings of evaluations, and “information nuggets” and fragments of storytelling that can be disseminated through social media accounts to spread the main messages of key policy and evaluation reports (OECD, 2020[17]). This strategy could also cover recommendations arising from the strategic evaluations conducted by the CoG institution responsible for monitoring and evaluation (presented in the section on macro-level guidance for monitoring and evaluation above). Indeed, setting a specific methodology to communicate the results of these evaluations could inform public institutions of the type of formal responses expected from them, improve the implementation of the recommendations, and allow follow-up. Countries that have developed such tailored communication and dissemination strategies, which increase access to clearly presented research findings, include Mexico, New Zealand, and the United Kingdom (Box 4.10).

The use of evaluations is linked to the existence of organisational structures and systems that enable and encourage the production (supply) and use (demand) of evidence. These structures and systems can be found at the level of specific institutions, such as management response mechanisms, or within the wider policy cycle, such as through the incorporation of policy evaluation findings into the budget cycle or discussions of findings at the highest political level (OECD, 2021[2]). Incorporation of evaluation findings in the budgetary cycle is one of the most commonly used mechanisms for promoting the use of evaluations. According to OECD data, 21 OECD countries incorporate findings from evaluations into the budgetary cycle (OECD, 2020[3]).

In Honduras, the National Congress is responsible for approving the General Budget of the Republic, which is prepared every year by the Secretariat of Finance together with government institutions. Every year before 15 September, the Secretariat of Finance sends the budget proposal to the National Congress, where a series of technical discussions between the Budget Office and the Congress Budget Commission take place. Secretariats of state may also participate in these discussions, justifying exceptional changes to their specific budgets. One way of improving the use of evaluation results in Honduras would be to encourage the use of policy evaluations conducted by the centre of government and the different secretariats of state as part of budgetary discussions in Congress to inform budget decisions. For instance, specific policy and programme evaluations could be included as an annex in the main budget document, when relevant.

Additionally, the centre of government could consider systematically holding discussions on evaluation results at the highest political level. The Council of Ministers was already carrying out six-monthly discussions on the progress reports of the Country Vision 2010-38 and the Nation Plan 2010-22. The Council of Ministers’ function could be strengthened if the main findings of the evaluations of strategic programmes and public policies (those prioritised in the annually set evaluation agenda) were also discussed at this stage, together with the budget proposal or progress reports on the national plans.

In Honduras, there are institutions beyond the centre of government that can help convey a strong message related to the importance of evidence-based decision making. Firstly, parliaments have a particular role to play in promoting the use of evaluations. They rely on verifiable and sound data on which they can base their policy initiatives and can thus push for the establishment of a structured approach to gather this information (OECD, 2020[3]). Most parliaments have research and information services that help members of parliament order, understand or request evaluation reports.

Honduras could therefore benefit from supporting and empowering members of Congress in their role as users of evidence as part of budget and general discussions. To do so, Honduras may wish to create a specialised unit within the National Congress aimed at providing technical support to members of the Congress as they analyse and use the results of evaluations carried out by the Executive on their main programmes and policies. Countries with specialised offices within congress/parliament that support the appropriate use of evidence by their members include Canada and the United Kingdom (Box 4.11).

Secondly, the Supreme Audit Court of Honduras could contribute to the use of monitoring and evaluation information and results by assessing government entities’ use of evidence in decision making, as part of its mandate to audit the effective and efficient use of public assets and resources. For example, the US Government Accountability Office produces reports and recommendations targeted to both the executive and Congress on the implementation of the US Government Performance and Results Act, which gives the Office of Management and Budget an important role in disseminating and integrating a results- and performance-based approach to public administration (OECD, 2020[3]).

Moreover, independent institutions responsible for monitoring and evaluating different aspects of the implementation of national plans in Honduras could contribute to the use of monitoring and evaluation results by including assessments of the use of evidence in decision making regarding the definition of the new plans. This includes the National Anticorruption Council, responsible for monitoring the transparent use of public resources allocated for implementation of the Nation Plan, and the National Forum of Convergence, a civil society body responsible for verifying and monitoring the execution of the Country Vision 2010-38 and the Nation Plan 2010-22 using an independent approach (Decree 286 of 2009). These institutions could play a key role in encouraging the government to formulate a new strategic plan based on evidence, lessons learned, and the results of previous evaluations.

This section lists the policy recommendations presented throughout the chapter, which aim to help the Honduran Government strengthen its monitoring and evaluation culture and promote the use of evidence and results in decision and policy making.

  • Develop and adopt a sound and robust whole-of-government legal framework to guide monitoring and evaluation activities across government. Such a legal framework could be developed within the broader planning system/performance framework and should include:

    • Clear and comprehensive definitions of monitoring and evaluation, with information on the objectives and advantages of these activities.

    • Clear mandates for specific institutional actors on when and how to conduct monitoring and evaluation activities.

    • Clear mandates for the Secretariat of Finance to promote the use of monitoring and evaluation results in budget decision making.

  • Define an annual evaluation agenda, communicate its findings widely and monitor its implementation. In particular:

    • Further develop a focused approach to evaluation by clearly defining and communicating an annual evaluation agenda, which specifies how many and which programmes and policies will be evaluated, the evaluators (what competencies they must have and whether the evaluations will be carried out by internal or external stakeholders), and when and how the evaluations should be conducted.

    • Define a specific methodology for communicating the recommendations arising from the evaluations conducted by the centre-of-government institution responsible for monitoring and evaluation, to clarify the formal responses expected from public institutions and to allow follow-up on the implementation of such recommendations.

  • Develop detailed and tailor-made guidelines and manuals on monitoring and evaluation practices. In particular:

    • Develop guidance on monitoring that clearly articulates monitoring activities for the different national plans (Country Vision 2010-38, Nation Plan 2010-22 and Strategic Government Plan 2018-22) and the institutional plans, and that clearly states the actors involved and their mandates, the timeline, and the tools and methodology for monitoring.

  • Secretariats of state should improve the quality of indicators used and data produced for monitoring and evaluation. In particular, secretariats of state should:

    • Explicitly link each institutional indicator (included in the institutional plan) to at least one national priority goal and strategy (included in the Nation Plan 2010-22 and the Strategic Government Plan 2018-22, respectively), and clarify the coherence between the institutional plan and the national plans.

    • Strengthen the robustness of the indicators of national and institutional plans by including key background information to facilitate their monitoring and evaluation. Background information should include a description of the indicator (with the formula for its calculation, the unit of measurement and the data source), the body responsible for collecting and reporting the indicator, the frequency of data collection and indicator updates, and the baseline and targets.

    • Carry out a comprehensive data inventory that accounts for all data assets created and collected by secretariats of state, as a first step towards developing a strategy to encourage systematic access to, and use of, administrative data.

    • Hire statistical officers within the UPEGs, to advise on statistical policy, techniques and procedures throughout the policy cycle and to help guarantee that data needed in the evaluation process are systematically collected from the beginning of an intervention.

  • Further strengthen methodologies and quality control for monitoring and evaluation across government. In particular:

    • Issue additional guidelines to clarify the working methods and tools that will support monitoring practices across government.

    • Develop explicit and systematic quality control mechanisms to ensure that evaluation design, planning, delivery and reporting are properly conducted and meet predetermined quality criteria, for instance by:

      • Submitting evaluations produced or commissioned by the CoG institution responsible for monitoring and evaluation to peer review by academic or international experts before they are published.

      • Designing self-evaluation checklists to help evaluators from the secretariats of state and the CoG institution responsible for monitoring and evaluation control the quality of their work.

  • Strengthen the role of the Supreme Audit Court of Honduras in promoting evaluation quality by including in its annual audit plan a minimum number of performance audits of specific policies or programmes that the government considers strategic, and by conducting evaluations of the country’s policy evaluation systems.

  • Further build capacity and strengthen competencies for monitoring and evaluation across government agencies. In particular:

    • Strengthen competencies in the monitoring units to develop key performance indicators.

    • Define quality standards to be included in the terms of reference for outsourcing and commissioning policy evaluations to external stakeholders, which allow secretariats of state and the CoG institution responsible for monitoring and evaluation to identify external evaluators with the right competencies for undertaking such evaluations.

    • Develop competencies to conduct in-house evaluations by continuing to offer online and face-to-face training courses together with the School for Senior Management in the Public Administration, and by developing specific training courses at the level of individual secretariats of state.

    • Foster knowledge sharing through a network of evaluators that connects those responsible for monitoring and evaluation within the UPEGs across the different secretariats of state with academics, the private sector and the international community.

  • Promote the use of monitoring and evaluation results and evidence in decision making, particularly in the budget negotiation process, for instance by encouraging the use of policy evaluations as part of budgetary discussions in the National Congress. In particular:

    • Develop a communication and dissemination strategy to adapt the way evaluation findings are presented to their potential users (policy makers, civil servants, the National Congress, citizens, academia, etc.). Such a strategy may include the use of infographics, tailored syntheses of evidence (e.g. in the form of policy briefs or executive summaries), seminars to present the main findings of evaluations, and “information nuggets” and fragments of storytelling that can be disseminated through social media accounts to spread the main messages of key policy and evaluation reports (OECD, 2020[17]).

  • Other actors could also contribute to increasing the use of monitoring and evaluation results. In particular:

    • The Secretariat of Finance could incorporate evaluation results into the budgetary cycle by informing budget decisions with evidence arising from impact and performance evaluations carried out by the CoG institution responsible for monitoring and evaluation and the secretariats of state.

    • The National Congress could create a specialised unit aimed at providing technical support to members of Congress in analysing and using the results of evaluations carried out by the CoG institution responsible for monitoring and evaluation and the secretariats of state on their main programmes and policies.

    • The Council of Ministers could discuss evaluation results at the highest political level by systematically holding consultations on the main findings of evaluations conducted by the CoG institution responsible for monitoring and evaluation.

References

[9] Congreso Nacional de Honduras (2020), Presupuesto General de Ingresos y Egresos de la República, Ejercicio Fiscal 2021, https://www.tsc.gob.hn/biblioteca/index.php/varios/973-presupuesto-general-de-ingresos-y-egresos-de-la-republica-ejercicio-fiscal-2021 (accessed on 7 October 2021).

[13] HM Treasury (2020), Magenta Book Central Government guidance on evaluation, https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/879438/HMT_Magenta_Book.pdf (accessed on 13 October 2021).

[10] Mathot, A. and F. Giannini (2022), “Evaluation Framework and Practices: A comparative analysis of five OECD countries”, OECD Journal on Budgeting, https://doi.org/10.1787/911cc792-en.

[15] OECD (2021), Better Governance, Planning and Services in Local Self-Governments in Poland, OECD Publishing, Paris, https://doi.org/10.1787/550c3ff5-en.

[2] OECD (2021), Monitoring and Evaluating the Strategic Plan of Nuevo León 2015-2030: Using Evidence to Achieve Sustainable Development, OECD Public Governance Reviews, OECD Publishing, Paris, https://doi.org/10.1787/8ba79961-en.

[12] OECD (2021), OECD Recommendation of the Council on Enhancing Access to and Sharing of Data, https://legalinstruments.oecd.org/en/instruments/OECD-LEGAL-0463.

[17] OECD (2020), Building Capacity for Evidence-Informed Policy-Making: Lessons from Country Experiences, OECD Public Governance Reviews, OECD Publishing, Paris, https://doi.org/10.1787/86331250-en.

[16] OECD (2020), How Can Governments Leverage Policy Evaluation to Improve Evidence Informed Policy Making? Highlights from an OECD Comparative Study, OECD Publishing, Paris, https://www.oecd.org/gov/policy-evaluation-comparative-study-highlights.pdf (accessed on 7 October 2021).

[3] OECD (2020), Improving Governance with Policy Evaluation: Lessons From Country Experiences, OECD Public Governance Reviews, OECD Publishing, Paris, https://doi.org/10.1787/89b1577d-en.

[4] OECD (2020), Policy Framework on Sound Public Governance: Baseline Features of Governments that Work Well, OECD Publishing, Paris, https://doi.org/10.1787/c03e01b3-en.

[7] OECD (2019), Open Government in Biscay, OECD Public Governance Reviews, OECD Publishing, Paris, https://doi.org/10.1787/e4e1a40c-en.

[8] OECD (2018), Draft Policy Framework on Sound Public Governance, http://www.oecd.org/gov/draft-policy-framework-on-sound-public-governance.pdf (accessed on 7 October 2021).

[18] OECD (2016), Evaluation Systems in Development Co-operation: 2016 Review, OECD Publishing, Paris, https://doi.org/10.1787/9789264262065-en.

[1] OECD (2016), OECD Public Governance Reviews: Peru: Integrated Governance for Inclusive Growth, OECD Public Governance Reviews, OECD Publishing, Paris, https://doi.org/10.1787/9789264265172-en.

[6] Secretaría de Coordinación General de Gobierno (2017), Guía Metodológica: Evaluación de Diseño, https://sgpr.gob.hn/SGPR.Admin2019/Repositorio/Details2/73 (accessed on 5 October 2021).

[5] Secretaría Técnica de Planificación y Cooperación Externa (2012), Guía para la Formulación de Indicadores, https://www.yumpu.com/es/document/read/14342669/guia-para-la-formulacion-de-indicadores-seplan (accessed on 5 October 2021).

[14] Stufflebeam, D. (2001), “Evaluation Checklists: Practical Tools for Guiding and Judging Evaluations”, http://www.wmich.edu/evalctr/checklists/.

[11] UN Global Pulse (2016), Integrating Big Data into the Monitoring and Evaluation of Development Programmes, https://www.unglobalpulse.org/wp-content/uploads/2016/12/integratingbigdataintomedpwebungp-161213223139.pdf (accessed on 7 October 2021).

Note

1. A policy framework is a document or set of documents that provide strategic direction and guiding principles to the government on a specific sector or thematic area.
