6. Belgium’s results, evaluation and learning
This chapter considers the extent to which Belgium assesses the results of its development co-operation; uses the findings of evaluations to feed into decision making, accountability and learning; and assists its partner countries and territories to do the same.
The chapter begins with a look at Belgium’s system for managing development results, i.e. whether the objectives of its development co-operation policies and programmes can be measured and assessed from output to impact. It then reviews the alignment of Belgium’s evaluation system with the Development Assistance Committee’s evaluation principles, looking specifically at whether an evaluation policy is in place, whether roles and responsibilities are clear, and whether the process is impartial and independent. Finally, it explores whether there is systematic and transparent dissemination of results, evaluation findings and lessons and whether Belgium learns from both failure and success and communicates what it has achieved and learned.
Belgium is currently reforming its approach to results-based management to highlight the overall contributions of its co-operation to the Sustainable Development Goals (SDGs). This effort to measure the overall results of Belgian co-operation is reflected in: (i) better definition of the results expected in future thematic strategies, in line with the SDGs; (ii) increased results monitoring by implementing partners from civil society and multilateral organisations; and (iii) efforts to synthesise information across the various funding channels. As the Directorate-General for Development Co-operation and Humanitarian Aid (DGD) finalises its new approach, it will need to ensure that the new results monitoring modalities also inform strategic decision making, including in partner countries and territories, and do not constrain adaptive management in fragile contexts.
Following successive co-operation reforms, Belgium has established a comprehensive evaluation system that is based on the evaluation capacities of implementing partners and enables the Office of the Special Evaluator to focus on cross-cutting and strategic assessment. Clarifying the division of labour between this office and Enabel’s internal evaluation function will ensure that Belgium takes full advantage of the complementarity of these actors.
Belgium is exemplary in monitoring lessons learned from evaluations; in particular, it has established a mechanism for monitoring the implementation of recommendations up to two years after they are made. As the DGD and Enabel develop new instruments to manage knowledge produced internally and by partners, co-ordination efforts would be improved through greater clarity on the institutional learning strategy, each actor’s role and the type of information needed for the DGD to provide strategic management.
The balance between accountability and strategic management is delicate
To highlight its contribution to the SDGs, the DGD is seeking to set objectives for the entire Belgian co-operation system and to aggregate ex-post data. This approach differs from the strategy for the 2015-20 period, which focused efforts on measuring results at the partner country and territory level (DGD, 2015[1]). In the new approach, the DGD has drawn up a list of standard indicators, taken from international sustainable development indicators, that measure the direct outcomes of a development intervention (OECD, Development Assistance Committee, 2016[2]). These indicators – included in sectoral and thematic strategy notes and the new results frameworks for Enabel country portfolios – should serve both as a compass for Belgian co-operation and as tools for measuring its contribution to the SDGs. This was initially proposed in the research project, “The SDGs as a Compass for the Belgian Development Co-operation” (HIVA-KU Leuven and IOB-UAntwerp, 2020[3]).1
Belgium will have to make sure that the number of standard indicators used by development co-operation actors remains limited. Although a high number of indicators makes it possible to cover the entire portfolio, it creates an administrative burden, internally and for partners, which can be unproductive and even run counter to a results-based approach designed to inform strategic management. Civil society organisations in particular have highlighted the burden and resources needed to provide information on these standard indicators. As most indicators do not document the operational or strategic issues organisations face in programme implementation, and given their limited resources, this burden may reduce the quality of the information provided. In order to maintain a limited number of relevant indicators, the DGD will need first to clarify the anticipated changes in Belgian co-operation, including through a theory of change that highlights the complementarities among its strategic priorities (Chapter 2).
Furthermore, the DGD has not yet highlighted how each implementing partner’s interventions should contribute to shared objectives, either at the level of the overall system or in future country strategies,2 with the exception of non-government actors’ (NGAs’) contribution to common strategic frameworks (CSFs)3 (Chapter 5). Therefore, the recommendation of the peer review on consolidating results information remains valid (OECD, 2015[4]). In this context, ongoing efforts to synthesise lessons, experiences and results across financing channels are crucial (Box 6.1).
As the DGD had little information on the results of the actions it finances, in 2019 it launched an exercise to take stock of ten years of Belgian co-operation in the health sector. Enabel and several NGAs participated in this exercise, which was led by the results department and the social development department. It concluded the following:
Efforts focused on strengthening health systems are aligned with partner country strategies, but policy dialogue with partner countries and territories is poorly documented.
Belgium is making long-term commitments, with coherent and complementary interventions, but the scaling up and sustainability of projects are often uncertain.
Interventions have proved innovative and effective: for example, sector budget support in Rwanda included a performance-based financing component and contributed to improved co-ordination of development co-operation providers. Nevertheless, Belgium is not able to highlight the impact of its co-operation in partner countries and territories because it does not monitor all of its interventions in each sector or country.
This exercise highlighted the strengths and weaknesses of the learning dynamics within Belgium’s co-operation and drew out lessons that cut across financing channels. Given the cumbersome nature of this type of exercise and the results department’s limited resources, the DGD is considering a lighter approach for future cross-cutting analyses with improved integration of its sectoral experts. The Power BI platform (see “Institutional learning” section) is supposed to enable the administration to extract non-financial data by sector or country, which should help to streamline future exercises by making it easier to gather information.
Source: DGD (2020[5]), Mémorandum de la Belgique – Examen de l’aide par les pairs Comité d’aide au développement de l’OCDE [OECD DAC Peer Review of Belgium: Memorandum].
Partners are accountable for results
The DGD’s efforts are supported by a results culture shared by Belgian co-operation actors and build on their increased empowerment. For example, the quality of results-based management systems, including for evaluation, forms part of the accreditation criteria for civil society partners and the selection of multilateral partners (Chapters 2 and 5). Civil society actors’ reporting requirements emphasise learning by asking for details only on areas of underperformance. Likewise, Enabel is contractually responsible for achieving predetermined results at the level of country portfolios, which it must report transparently (Kingdom of Belgium, 2017[6]).
Despite efforts to improve the measurement of development results, the Belgian Investment Company for Developing Countries (BIO) provides little information to the DGD on the expected and achieved development results of each investment or on the methods for measuring and monitoring them (Chapter 3). By continuing the dialogue initiated with BIO to clarify rationales for each investment and underlying theories of change, the DGD will be better placed to ensure that these investments contribute to Belgian co-operation objectives for sustainable development and avoid undesirable impacts.
The focus on fragile contexts requires adaptive management
Following an evaluation of its results-based management (Enabel, 2018[7]), Enabel is in the process of modifying its approach. Among other things, the evaluation found that the mechanisms focused more on accountability to the results-based management system than on the use of results information for learning and decision making. While the previous theoretical framework was sound, it was limited to measuring results at the project level. In addition, the organisational culture essentially promoted individual learning, and the lack of capacity did not allow for systematic analysis of results-based information for learning or decision-making purposes. Consequently, Enabel is working to streamline reporting and data collection processes and to rebalance results-based management functions to strengthen accountability and the use of information on development results for learning and decision making.
Given its flexibility in implementing its country portfolios (Chapter 4), in theory the agency has the organisational processes for adaptive programming and management. This method of management is critical in fragile contexts, which account for the majority of Belgian direct bilateral co-operation. Nevertheless, the annual requirement to report its contribution to pre-identified expected results and the pressure to demonstrate effectiveness after two or three years of interventions may conflict with the need for non-linear approaches in volatile contexts. A joint reflection with the DGD on the possible adaptation of results-based management in fragile contexts will make it possible to develop country theories of change that allow for flexible and adaptive management methods, while demonstrating realistic medium-term objectives.
As part of the partnership with civil society organisations, the use of theories of change has encouraged greater flexibility for partners to modify programmes based on learning through implementation (OECD, 2020[8]).
Using partner data is a stated intention but limited in practice
The new management contract between the federal government and Enabel clearly states that the agency must first and foremost use the results frameworks and indicators of its partner countries and territories or those used by other donors (Kingdom of Belgium, 2017[6]). Enabel is called on to develop specific systems only where such indicators do not exist, or where the systems are not relevant or are of insufficient quality. Thus, the results framework of the Burkina Faso country portfolio, for example, is directly aligned with that of the National Plan for Economic and Social Development (PNDES).
Despite this, the share of project objectives that are aligned with partner country and territory priorities has decreased considerably since 2016, from 94% to 40%, well below the DAC average (81%), as has the use of government-defined results, statistics and monitoring systems (OECD/UNDP, 2019[9]). This decline is partly because the majority of countries and territories consulted during the second monitoring cycle of the Global Partnership for Effective Development Co-operation benefited from interventions implemented not by Enabel but by other Belgian partners.4 The question is, therefore, how to encourage all Belgian co-operation actors, not just Enabel, to use partner country and territory results systems.
More disaggregated data will ensure that no one is left behind
Enabel’s efforts to collect disaggregated data are in keeping with Belgium’s human rights priorities. The agency collects data disaggregated by age and sex, reflecting the priority given to women and children (Chapter 2). As Belgium is seeking to align its co-operation as much as possible with the 2030 Agenda, disaggregating these data further – while making maximum use of data from partner countries and territories – would help it to better implement the principle of leaving no one behind by ensuring that the most disadvantaged are indeed the beneficiaries of Belgian co-operation.
The evaluation system strives to enhance actors’ capacity
Following successive co-operation reforms, Belgium has established a comprehensive evaluation system that builds on the evaluation capacities of co-operation actors. The Office of the Special Evaluator of the FPS Foreign Affairs, Foreign Trade and Development Co-operation has thus established a certification and evaluation process to assess the monitoring and evaluation systems of Belgian government and non-government actors who are now responsible for the mid-term and final evaluations of their projects.5 This mechanism for ensuring the quality of evaluations allows the Office of the Special Evaluator to focus on transversal evaluations, and to base its analyses on previous evaluations.6 This approach also ensures a continuum of evaluations, covering projects, instruments, themes, sectors and countries.
The ongoing review of the evaluation policy of the FPS Foreign Affairs, Foreign Trade and Development Co-operation and of Enabel’s strategic evaluation plans may provide an opportunity to formalise the new division of roles between these two institutions. The current evaluation policy (SES, 2014[10]) predates the NGA accreditation process and the creation of Enabel’s internal evaluation function. This new function involves conducting internal systems evaluations and cross-cutting strategic evaluations. Thus, there is a risk of confusion as to who is responsible for country and sector evaluations: Enabel is responsible for evaluating the country portfolios that it implements, but the Office of the Special Evaluator is the only entity with the mandate to evaluate all the delivery channels and development co-operation actors of Belgian co-operation that operate in one country.7 Formalising the division of roles between the Office of the Special Evaluator and Enabel would make it possible to enhance the complementarity between evaluations, reduce the risk of duplication and avoid breaking the continuum of evaluations.
A solid institutional structure ensures that evaluations are independent
Evaluations are independent thanks to the institutional positioning of the Office of the Special Evaluator, its budgetary and planning autonomy, and the control mechanisms put in place for each evaluation.8 This office, led by a special evaluator who is selected by an independent jury, reports directly to the management committee of the FPS Foreign Affairs, Foreign Trade and Development Co-operation and reports annually to parliament. It has its own budget, which allows it to develop an evaluation programme over two or three years that can cover all Belgium’s official development assistance, after consulting on stakeholder needs.
Enabel has a similar system in place, with the new internal evaluation department reporting directly to the board of directors. It is important that the future department has programming independence and its own budget to conduct evaluations that meet the agency’s knowledge needs and that complement those of the Office of the Special Evaluator.
Partnerships strengthen partner countries and territories’ evaluation capacities
Belgium supports evaluation partnerships. According to monitoring carried out by the Global Partnership for Effective Development Co-operation, partner governments are involved in 75% of final project and programme evaluations, compared with a DAC average of 48% (OECD/UNDP, 2019[9]). Enabel has a tool to assess the evaluation capacities of partner countries and territories. Although it was not used when designing the latest country portfolios, management now advocates the systematic use of a simplified version of the tool for future portfolios as part of the new approach to results-based management. As the strategic evaluations conducted by the Office of the Special Evaluator cut across the whole Belgian co-operation system, partners are more rarely involved in the steering committees.
Given its limited financial and human resources, the Office of the Special Evaluator positions itself as a facilitator of, rather than a major donor to, evaluation capacity building. Within this framework, it awards grants to experts from public institutions, civil society, donor agencies, academia and the private sector in partner countries and territories to support their participation in seminars. It also participates as a trainer in certain training programmes organised by its partners (SES, 2019[11]; SES, 2018[12]; SES, 2017[13]). By requiring that applications come from mixed evaluation teams, i.e. teams that also include locally based evaluators, the office also ensures that its evaluations indirectly help to strengthen evaluation capacities in these countries and territories.
The monitoring of recommendations is exemplary
The FPS Foreign Affairs, Foreign Trade and Development Co-operation is exemplary in monitoring the lessons learned from evaluations. Each evaluation conducted by the Office of the Special Evaluator is subject to a management response that is drafted by all evaluation stakeholders, including BIO and Enabel where relevant, and published online. As recommended in the previous peer review (OECD, 2015[4]), the service monitors the implementation of recommendations up to two years after the study, with the help of the DGD results department when it comes to FPS commitments.9 While recommendations are generally implemented, it is not always possible to fully synchronise evaluations with strategic decision making due to limited human resources within the Office of the Special Evaluator. For example, the evaluation of the Belgian approach to private sector strengthening was only finalised after a new government policy on the issue was developed.
Establishing learning priorities would help to structure knowledge management
Both the DGD and Enabel position themselves as centres of knowledge and have launched numerous initiatives to improve institutional learning. A review of these initiatives and in-depth reflection on the division of roles would avoid duplication of efforts, strengthen the complementarity of approaches, and enable the DGD to position itself as a strategic rather than a sectoral centre of knowledge.
The DGD aims to be a centre of knowledge and a driver of Belgian co-operation. It does so by using numerous instruments to extract and centralise the knowledge generated by its partners, but these are not always co-ordinated (OECD, 2015[4]). In particular, Belgium has embarked on an IT project using Power BI software to extract monitoring sheets from a shared database. These sheets are arranged by theme, sector, country and actor, and go beyond financial information, providing a cross-cutting view of financing channels. This approach is useful for synthesising information and facilitating its dissemination, as the DGD relies entirely on its partners to implement its co-operation programme. This project is complemented by the production of sectoral knowledge documents (Box 6.1), lessons learned by NGAs and the mobilisation of policy-supporting research.10 However, these various initiatives are rather ad hoc, and do not specify what the knowledge generated would be used for or by whom, either within or outside the administration. A clearer medium-term institutional learning strategy would enable the DGD to better structure these efforts around priorities so that the knowledge generated can be put to good use.
Enabel is also seeking to position itself as a centre of knowledge and policy support. It has put knowledge management at the heart of the new management strategy based on three levels: operational, results and learning. At present, a significant proportion of knowledge management involves direct exchanges among staff. For example, workshops at headquarters that bring together staff posted in partner countries and territories to discuss particular themes, bi-annual meetings between local representatives, and exchanges between country representatives and experts at headquarters facilitate knowledge sharing. Belgium’s strong geographic concentration on two regions, the Sahel and the Great Lakes, offers exchange opportunities that are not yet fully exploited in the agency’s learning dynamics, however. In order to formalise knowledge management, Enabel is also developing new tools, such as standard indicators (see previous section), “key solutions” relating to the broad issues the agency has identified, and funding plans that are tailored to country interventions. As these are still in development, this peer review could not analyse their effectiveness.
References
[5] DGD (2020), “Mémorandum de la Belgique - Examen de l’aide par les pairs Comité d’Aide au Développement de l’OCDE”, [DAC Peer Review of Belgium: Memorandum], Working document, FPS Foreign Affairs, Foreign Trade and Development Cooperation, Brussels.
[1] DGD (2015), “Note stratégique Résultats de développement” [in French], FPS Foreign Affairs, Foreign Trade and Development Cooperation, Brussels, https://diplomatie.belgium.be/sites/default/files/downloads/Note-strategique-Resultats-de-Developpement.pdf.
[7] Enabel (2018), “Les résultats dans le cycle de gestion” [in French], Belgian Development Agency, Brussels, https://www.enabel.be/sites/default/files/eval_gestion-axee-resultats_rapport.pdf.
[3] HIVA-KU Leuven and IOB-UAntwerp (2020), “The SDGs as a Compass for the Belgian Development Co-operation - Final report”, HIVA-KU Leuven, Leuven, http://repository.uantwerpen.be/docstore/d:irua:551.
[6] Kingdom of Belgium (2017), Arrêté royal portant approbation du premier contrat de gestion entre l’Etat fédéral et la société anonyme de droit public à finalité sociale Enabel, Agence belge de développement [in French], http://www.ejustice.just.fgov.be/mopdf/2017/12/22_1.pdf#Page19.
[8] OECD (2020), Development Assistance Committee Members and Civil Society, The Development Dimension, OECD Publishing, Paris, https://dx.doi.org/10.1787/51eb6df1-en.
[4] OECD (2015), OECD Development Co-operation Peer Reviews: Belgium 2015, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264239906-en.
[2] OECD, Development Assistance Committee (2016), “Discussion note: Development Co-operation Under the 2030 Agenda: Results and Relevance”, OECD, Paris, https://www.oecd.org/dac/results-development/docs/Development_co-operation_under_2030_Agenda.pdf.
[9] OECD/UNDP (2019), Making Development Co-operation More Effective: 2019 Progress Report, OECD Publishing, Paris, https://dx.doi.org/10.1787/26f2638f-en.
[11] SES (2019), “Rapport annuel de l’Évaluatrice spéciale de la Coopération belge au développement Mars 2019 : Dépasser les clivages Nord-Sud” [in French], FPS Foreign Affairs, Foreign Trade and Development Cooperation, Brussels, https://diplomatie.belgium.be/sites/default/files/downloads/rapport_annuel_2019_depasser_les_clivages_nord-sud.pdf.
[12] SES (2018), “Rapport annuel de l’Évaluatrice spéciale de la Coopération belge au développement Mars 2018 - De l’évaluation de l’aide vers l’évaluation du développement” [in French], FPS Foreign Affairs, Foreign Trade and Development Cooperation, Brussels, https://diplomatie.belgium.be/sites/default/files/rapport_annuel_evaluatrice_speciale_cooperation_2018.pdf.
[13] SES (2017), “Rapport annuel de l’Évaluatrice spéciale de la Coopération belge au Développement Mars 2017 - Évaluer l’aide au développement : hier et demain” [in French], FPS Foreign Affairs, Foreign Trade and Development Cooperation, Brussels, https://diplomatie.belgium.be/sites/default/files/downloads/rapport_annuel_de_levaluateur_special_2017.pdf.
[14] SES (2016), “Évaluation de l’appui aux politiques par les acteurs institutionnels” [in French], FPS Foreign Affairs, Foreign Trade and Development Cooperation, Brussels, https://diplomatie.belgium.be/sites/default/files/downloads/evaluation_de_lappui_aux_politiques_par_les_acteurs_institutionnels.pdf.
[10] SES (2014), “Politique d’évaluation : Tirer des enseignements de l’expérience passée et rendre compte des résultats” [in French], FPS Foreign Affairs, Foreign Trade and Development Cooperation, Brussels, https://diplomatie.belgium.be/sites/default/files/downloads/politique_evaluation_SES_v2.pdf.
Notes
1. It is also planned to develop a series of specific indicators for non-government actors (NGAs) once the research work is finalised.
2. The DGD has not yet adopted country strategies that include all activities financed by Belgium, regardless of the financing channel, but the process is in a pilot phase in some partner countries.
3. Following the introduction of common strategic frameworks for NGAs (Chapter 5), NGOs should report not only on the results of their interventions but also on their contributions to the outcomes of the relevant strategic framework.
4. Whereas in 2016, 12 of the 13 participants were partner countries and territories.
5. The certification process assesses the organisation’s evaluation capacity, the evaluation processes it uses and the resulting evaluation reports. The use of lessons learned is also analysed.
6. Over the 2018-19 period, the Office of the Special Evaluator assessed, among other things, Belgian development co-operation support to the private sector, development education activities, the impact of Belgian university development co-operation, inclusive and sustainable entrepreneurship in the agricultural sector in Benin, Belgian exit strategies from six countries of direct bilateral co-operation, as well as the Belgian Fund for Food Security, the integration of the food security theme, and the multi-actor approach in the framework of Belgian development co-operation.
7. The Office of the Special Evaluator has the mandate to evaluate any federal government activity that is counted as ODA.
8. For each evaluation, the service also ensures that there are no conflicts of interest in choosing the external evaluation team and arbitrates on the comments made by members of the evaluation steering committee.
9. This monitoring of recommendations is published in the Special Evaluator’s annual report.
10. However, an evaluation of the latter tool (SES, 2016[14]) noted that it was more often used to strengthen strategies than to develop new policies, and highlighted the risk that as its human resources decline, the DGD may no longer be able to make relevant requests and use research results.