Chapter 6. Sweden’s results, evaluation and learning

This chapter considers how Sweden plans and manages for results in line with the Sustainable Development Goals, builds evidence of what works, and uses this evidence to learn and adapt.

Sweden has adopted a pioneering approach to results-based management, which focuses on achieving long-term, sustainable results through learning and adaptive programming. In order for this approach to be realised, Sweden needs to build staff and partner capacity and strengthen its systems.

Sweden helps to build partner countries’ statistical, evaluation and research capacity. However, it could do more to use partner country results frameworks to monitor and evaluate its country programmes.

Sweden’s evaluation system is in line with OECD DAC principles, and steps are being taken to improve the relevance of its independent strategic evaluations. Decentralised programme evaluations, context analyses and reviews are being used by teams to improve individual programmes and enhance portfolios at the country and thematic levels. However, there is scope to strengthen knowledge management systems to ensure this learning is shared more systematically.

    

Management for development results

Peer review indicator: A results-based management system is being applied

Sweden has introduced an innovative approach to results-based management that focuses on achieving long-term, sustainable results and is based on continuous learning and adaptive programming. For this approach to be fully realised, Sweden needs to build staff and partner capacity and strengthen its systems. Sweden strives to demonstrate progress against its strategic objectives even in the absence of a standard approach to results indicators.

Sweden is embracing a pioneering approach to results-based management

Since the last peer review, Sweden has transformed its approach to results-based management with the aim of achieving long-term, sustainable results. Sweden has adopted a learning-based approach centred on adaptive programming, in contrast to approaches that are often skewed towards demonstrating accountability upwards to funders and overly focused on quantitative measurement of results (Sida, 2018a).

Three key features stand out in Sweden’s new approach. First, Sweden emphasises long-term results. Its new aid policy and strategies contain objectives based on high-level, long-term outcomes that align strongly with the Sustainable Development Goals (SDGs). Operational plans for the strategies and individual programmes, in turn, are meant to articulate a theory of change for how these high-level objectives are to be met in a given context.

Second, Sweden has decided against putting in place a standard set of results indicators for demonstrating impact at the corporate, strategy (country/global thematic) or programme level. Strategy teams and partners are free to choose the method they think works best, presenting results either qualitatively or quantitatively. Sweden requires only that its staff and partners be able to demonstrate what they want to achieve in relation to strategy objectives, how they intend to achieve it, and the progress they are making in delivery.

Third, Sida is focusing on learning and adaptation. Inspired by Problem Driven Iterative Adaptation and Doing Development Differently1 approaches, Sida is encouraging more adaptive management of programming that uses results information to change programming in real time, on the basis of what is and what is not working (Government Offices of Sweden, 2018).

The new approach offers an opportunity to focus more on impact, but staff capacity building and systems change are required

This new approach offers Sweden the opportunity to focus more on impact and on what is driving change in real time, rather than on a static picture of context and a narrow set of predetermined inputs and outputs. However, the approach relies on Ministry for Foreign Affairs (MFA) and Sida staff and their partners having the capacity to develop a solid theory of change and to regularly monitor results and changes in the wider context, testing whether their assumptions about what drives change hold up. It also requires systems that make it easy to alter programming if need be, with all the changes to budgeting and staffing this may entail. Sweden is beginning to build staff capacity to deliver this approach (Government Offices of Sweden, 2018).2 It has also started to adjust its programme management systems,3 which already had a good degree of flexibility built in, to help staff focus more on dialogue with partners. Sweden is aware, though, that more needs to be done to ensure this innovative approach genuinely takes root.

Adaptive programming is not yet standard practice throughout the organisation. Sida is piloting new adaptive programme management techniques in the Africa Department to explore how best to implement this approach.4 This is a good way to better understand staff training needs and what further programme management system reforms are required. Given the emphasis on learning from results, attention also needs to be paid to upgrading knowledge management systems so that learning is captured beyond the programme team.

Sweden also needs to significantly scale up the support it provides to its partners. While staff are encouraged to fund partner capacity building on results-based management, at present there is no external guidance for partners on implementing this new approach. In Liberia, for example, many partners welcomed Sweden’s new approach and felt supported by the embassy in delivering it, but some also felt they would benefit from clearer guidance on what is expected of them.

Sweden monitors and communicates transparently the results of its development co-operation

Sweden’s decision not to use a standard methodological approach for collecting and monitoring results offers teams and partners the flexibility to pick the results format that best suits their context. This is particularly welcome in fragile settings and for objectives that aim to achieve long-term behavioural change. The absence of a standard methodological approach, however, makes aggregating results at the corporate, country and thematic levels difficult (OECD, 2017). Despite not being able to aggregate results, Sweden strives to demonstrate progress at the strategic level by focusing on outcome changes in the context and highlighting how individual programmes contribute to such progress.

Results are monitored and tracked at the programme level through Sida’s Tool for Results Management and Appraisal of Contributions, and at the strategy and thematic level through annual progress reports. Sida’s strategy reporting to the MFA includes a traffic light rating system to assess overall performance. This enables the MFA to monitor progress in meeting each strategy’s high-level objectives, based on an assessment of whether the external context is improving, alongside an assessment of programme delivery.5 Individual programme results are highlighted in narrative form, with no attempt made to aggregate results across all contributions. Sida also reports to the public on its activities at the corporate level, and the MFA reports to the parliament. Reporting is context-based, mainly qualitative, and uses a contribution rather than an attribution approach.6

Sweden helps to build partner countries’ statistical capacity, but struggles to use their results systems for programming

Sweden was among the top ten donors in terms of the volume of ODA provided to support partner countries’ national statistical capacity between 2014 and 2016 (PARIS21, 2018). Its extensive context analysis for assessing performance at the strategy level draws on national statistics. However, because of its limited programming with partner country governments and its work in often contentious areas, Sweden struggles to use partner country-led results frameworks. According to preliminary data from the Global Partnership for Effective Development Co-operation, the share of Sweden’s assessed aid provided through partner country-led results frameworks declined from 81.8% in 2010 to 65.2% in 2018 (Chapter 5).

Evaluation system

Peer review indicator: The evaluation system is in line with the DAC evaluation principles

Sweden’s evaluation system is in line with the DAC principles and steps are being taken to improve the relevance of its strategic independent evaluations. Sweden struggles, however, to use partner countries’ evaluation systems.

Sweden’s evaluation system adheres to the DAC principles

Sweden’s aid is evaluated by several bodies,7 but the two main entities are Sida’s Evaluation Unit and the Expert Group for Aid Studies (EBA). In line with the OECD Development Assistance Committee (DAC) evaluation principles, both entities are independent from policy making and delivery, have dedicated budgets and staff (OECD, 2016a), maintain forward-looking evaluation plans, and have policies and guidelines that adhere to the DAC principles (OECD, 2016b; Sida, 2018c).

The government established the EBA in 2013, in response to the closure of the Agency for Development Evaluation. The EBA comprises a committee appointed by the government and is tasked with evaluating and analysing Sweden’s development co-operation, both bilateral and multilateral, and with disseminating the findings of its work. The EBA has ten members and a secretariat of eight staff. It published 12 reports in 2017 (6 evaluations and 6 mappings, overviews and analyses) (Government Offices of Sweden, 2018).

An organisational change in 2018 strengthened the independence of Sida’s evaluation function by creating a separate Evaluation Unit that reports directly to the Director-General.8 The Unit has a dual mandate: assisting programme teams to ensure the quality of their decentralised evaluations and undertaking more strategic independent evaluations.

Steps are being taken to improve the relevance of Sweden’s strategic independent evaluations

Despite its mandate, the EBA has struggled since its inception to deliver independent strategic evaluations, with limited initial output, and the evaluations that were produced often lacked relevance and impact. A Statskontoret (2018) analysis of the EBA commissioned by the government found that “the EBA’s reports have not had any direct impact on the government’s policy, nor have they affected the way Sida works in any decisive way”. However, Statskontoret also noted that positive developments are underway to address the issue and that the EBA should remain structurally unchanged (Statskontoret, 2018).9

Steps have been taken to improve the EBA’s evaluation performance, and output and quality are visibly improving. The government updated the EBA’s directives in 2016, emphasising the EBA’s role in carrying out evaluations (as opposed to analysis more broadly), and increased its budget. These actions have led to an increase in the number of evaluations the EBA has produced in recent years (Government Offices of Sweden, 2018). The EBA has also taken steps to improve the relevance of its evaluations, and these are beginning to pay off. It has intensified its dialogue with the MFA and Sida to help it select more strategically pertinent topics, and recipients of reports are being included in reference groups to help ensure the reports contain more practical recommendations. Monitoring that the EBA continues to fulfil its mandate to provide independent evaluations should remain a priority for Sweden, given that the EBA is the only body that evaluates the totality of Sweden’s development co-operation efforts.

Sida’s Evaluation Unit remains heavily focused on assisting Sida’s operational units with decentralised evaluations to ensure their quality, integrity and reliability. The Unit led or supported only three strategic evaluations in 2018: one independent strategic evaluation it commissioned itself and two commissioned by thematic departments with the Unit’s support. It also co-managed 30 decentralised evaluations.10 While the Unit sees a division of labour between itself and the EBA on independent strategic evaluations, Sida should reflect on whether it has struck an appropriate balance between its own strategic evaluations and its decentralised evaluations, which tend to focus on a single intervention. The Unit is also trying to improve the relevance of its strategic and decentralised evaluations by aligning them more closely with Sida’s organisational priorities.

Sweden is struggling to use partner country evaluation systems

Preliminary data for the Global Partnership for Effective Development Co-operation’s 2019 Global Monitoring Report show that Sweden makes little use of partner country governments’ evaluation systems to assess its bilateral country programmes (Chapter 5). While Sweden supports the World Bank’s CLEAR Initiative, which aims to help build partner countries’ monitoring and evaluation capacity, it could do more to use partner countries’ systems to evaluate its country programmes.

Institutional learning

Peer review indicator: Evaluations and appropriate knowledge management systems are used as management tools

Sweden uses decentralised programme evaluations, context analyses and reviews to inform its decision making, and its management responses to evaluations have been strengthened. However, weak knowledge management systems are preventing Sweden from sharing programme and country learning across teams.

Programme evaluations are used to inform decision making and there is a strong focus on building partner countries’ research capacity

Decentralised programme evaluations, context analyses and reviews are being used to improve programme and strategy delivery, as evidenced in Liberia and in Sweden’s humanitarian work (Chapter 7 and Annex C). The new guidelines for strategies (Government Offices of Sweden, 2017) should strengthen this further through their requirement of an in-depth review of the operationalisation of each strategy in the last year of the cycle.

Sida has a strong management response process that has been in place since 1999, with responses mandatory for all centralised and decentralised evaluations. As of 2018, management responses to strategic reviews are also followed up as part of Sida’s operational planning cycle, ensuring lessons are taken on board across the institution. The MFA has also strengthened its management response process since the last peer review.11

All EBA reports are made public, along with all Sida-commissioned decentralised and centralised evaluations; further, the EBA actively disseminates its work, holding public seminars to foster debate on the findings. There is scope for Sida to make better use of the findings from decentralised evaluations, as well as the host of context analyses and reviews undertaken by individual teams and embassies, to enable learning across the whole of Sweden’s development co-operation system. Sida’s Evaluation Unit is starting to cluster decentralised evaluations that share a common theme and to insert common questions in them, so that findings can be synthesised and learning shared. This is a good first step, but given the wealth of information that exists, more thought should be given to how to mine and disseminate it.

Sweden’s research budget amounted to SEK 920 million in 2017 and is guided by a research strategy (Government Offices of Sweden, 2015) that focuses on building low-income countries’ own capacity to undertake research and on supporting both international and Swedish research (the latter via the Swedish Research Council) that is relevant to low-income countries. Sweden has adopted a long-term approach to supporting low-income countries’ capacity. For example, it has supported a partnership between Swedish and Tanzanian researchers for 40 years; 216 Tanzanians will have earned PhDs through this partnership by 2020. Sweden also supports African networks such as the Consortium for Advanced Research Training in Africa (CARTA), which brings together 16 African universities and research institutes to offer world-class doctoral education.

Knowledge management remains a challenge for Sweden

The last peer review called on Sweden to improve its knowledge management systems to enhance learning. This remains a challenge, as is recognised by Sweden (Government Offices of Sweden, 2018). This is particularly urgent given the results-based approach adopted by Sweden, which is built on the need to constantly learn and adapt, and the fact that so much of Sweden’s development co-operation is decentralised.

Thematic and functional networks exist in Sida and the MFA, but information and digital technology for disseminating learning remains weak in both institutions. Sida’s new Unit for Learning and Organisational Development, established in 2017, is an encouraging development. The unit aims to foster leadership and a culture based on learning. It also aims to explore how digital tools and systems can be used to improve knowledge sharing and to support more adaptive, learning-based ways of working, as exemplified by its current support for piloting more agile programming techniques in the Africa Department.

References

Government sources

Government Offices of Sweden (2018), DAC Peer Review 2019 - Memorandum of Sweden, September 2018, Stockholm.

Government Offices of Sweden (2017), Guidelines for Strategies in Swedish Development Cooperation and Humanitarian Assistance, Stockholm, https://www.government.se/48feb3/contentassets/3291aeacc48c495898d5bd59702d9e32/guidelines-for-strategies-in-swedish-development-cooperation-and-humanitarian-assistance.pdf.

Government Offices of Sweden (2015), Strategy for Research Cooperation and Research in Development Cooperation 2015-2021, Annex to Government decision 18 December 2014 (UF2014/80398/UD/USTYR), Ministry for Foreign Affairs, Stockholm, https://www.regeringen.se/49f23e/contentassets/35640f803c554f5abe17800242c5bcb3/strategi-for-forskningssamarbete-pdf-for-webb-eng-2.pdf.

Riksdag (2016), Internationellt bistånd, Utgiftsområde 7, Prop. 2016/17:1, Förslag till statens budget för 2017 [Proposal for the 2017 state budget, expenditure area 7: International assistance], Stockholm, https://www.riksdagen.se/sv/dokument-lagar/dokument/proposition/budgetproposition-2017-utgiftsomrade-7_H4031d8/html.

Sida (2018a), “Reclaim the results!” (online video), Sida Development Talks, presented 14 February 2018, Stockholm, https://www.sida.se/Svenska/aktuellt-och-press/development-talks/inbjudningar/reclaim-the-results (accessed 29 January 2019).

Sida (2018b), The Year in Review - Sida’s Activities in 2017, Sida, Stockholm, https://www.sida.se/contentassets/f6334f0d4dd94548bdd6c79c9253a020/the_year_in_review-sidas_activities_in_2017_webb.pdf.

Sida (2018c), Sida’s Evaluation Handbook: Guidelines and Manual for Conducting Evaluations at Sida, Sida, Stockholm, https://www.sida.se/contentassets/7bf0f1bc150c4b92b67722c95a0eecf9/sidas_evaluation_handbook_external.pdf.

Sida (2018d), Strategic Evaluation Plan 2018, Sida, Stockholm, https://www.sida.se/contentassets/341c5138cc204fe48550a1853e108552/strategic-evaluation-plan-2018-english-external-use.pdf.

Sida (2018e), Strategy Report for Liberia 2016-2020: Update of the Strategy Implementation and Assessments of Results Since the Latest Strategy Reporting Date Until April 15, 2018, Sida, Stockholm, (internal document).

Statskontoret (2018), “English summary of Review of the Expert Group for Aid Studies (2018:16)”, Swedish Agency for Public Management, Stockholm, http://www.statskontoret.se/In-English/publications/2018---summaries-of-publications/review-of-the-expert-group-for-aid-studies-201816/.

Statskontoret (2012), Evaluation of Sweden’s International Aid: A Review of Evaluation Activities, Swedish Agency for Public Management, Stockholm.

Other sources

Andrews, M., L. Pritchett and M. Woolcock (2013), “Escaping capability traps through Problem Driven Iterative Adaptation (PDIA)”, World Development, Vol. 51 (C), pp. 234-244.

Center for International Development (2014), “The DDD Manifesto”, Building State Capability blog, CID, Harvard University, https://buildingstatecapability.com/the-ddd-manifesto/ (accessed 29 January 2019).

OECD (2017), “Case studies of results-based management by providers: Sweden, May 2017”, Results in Development Co-operation, OECD Publishing, Paris, https://www.oecd.org/dac/results-development/docs/results-case-study-sweden.pdf.

OECD (2016a), Evaluation Systems in Development Co-operation: 2016 Review, OECD Publishing, Paris, https://doi.org/10.1787/9789264262065-en.

OECD (2016b), “EBA working methods, 16 December 2014”, cited in Annex C, Evaluation Systems in Development Co-operation: 2016 Review, OECD Publishing, Paris, https://doi.org/10.1787/9789264262065-en.

OECD (2013), OECD Development Co-operation Peer Review: Sweden 2013, OECD Publishing, Paris, http://www.oecd.org/dac/peer-reviews/sweden-peer-review-2013.pdf.

OECD/UNDP (2016), Making Development Co-operation More Effective: 2016 Progress Report, OECD Publishing, Paris, https://doi.org/10.1787/9789264266261-en.

PARIS21 (2018), PRESS 2018: Partner Report on Support to Statistics, PARIS21 Secretariat, Paris, http://www.paris21.org/sites/default/files/inline-files/PRESS2018_V3_PRINT_sans%20repres_OK_0.pdf.

Notes

← 1. Problem Driven Iterative Adaptation (PDIA), set out by Andrews, Pritchett and Woolcock (2013), offers a framework for designing development programmes that rests on four principles: local solutions for local problems; pushing problem-driven positive deviance; trying, learning, iterating, adapting; and scaling up through diffusion. Closely associated with PDIA is Doing Development Differently (DDD), an emerging community of development practitioners and observers who believe that development co-operation can have greater impact if programming focuses on learning and adaptation rather than on pre-planned and fixed programme designs. The DDD Manifesto emerged from a 2014 workshop hosted by the Building State Capability programme at Harvard University’s Center for International Development and the Overseas Development Institute. The Manifesto is available at https://buildingstatecapability.com/the-ddd-manifesto/.

← 2. Some guidance and e-learning training materials are already available to Sida staff to help them identify the components of a solid theory of change and to support greater attention to ongoing dialogue with partners to enable programme adaptation.

← 3. The new strategy process enables the MFA to regularly assess whether its objectives are appropriate in light of changing contexts and to monitor the progress of Sida’s programmes in delivering on these objectives.

← 4. The Africa Department is testing new adaptive budgeting processes that would enable a far more iterative process of project management.

← 5. These reports include two different ratings. The first is based on the overall development context and the degree to which the strategy’s objectives are being met, with no assessment of Sweden’s contribution or attribution towards these results. The second is based on the degree to which Sweden’s contributions are being successfully implemented; this includes an assessment of portfolio alignment with objectives and the degree to which contributions are on track with delivery, based on partners’ results reporting.

← 6. Sida’s public annual review of its activities uses a similar traffic light format for reporting its progress in delivering on strategy objectives and includes an account of some programme results in each thematic area. For further information, see Sida’s Year in Review 2017 at https://www.sida.se/contentassets/f6334f0d4dd94548bdd6c79c9253a020/the_year_in_review-sidas_activities_in_2017_webb.pdf. The MFA, drawing on Sida’s results reporting and that of its multilateral partners, also provides a results summary in its annual report to the parliament. See also www.riksdagen.se/sv/dokument-lagar/dokument/proposition/budgetproposition-2017-utgiftsomrade-7_H4031d8/html (in Swedish).

← 7. Sweden’s National Audit Office provides independent audits and performance evaluations of Sweden’s development co-operation and reports directly to Sweden’s parliament. The Agency for Public Management (Statskontoret), a government agency, can also evaluate Sweden’s development co-operation at the request of government departments.

← 8. Prior to 2010, Sida’s evaluation function was an independent secretariat reporting to the Director-General. Between 2010 and 2018, it was located within another department and was not considered fully independent according to DAC criteria.

← 9. The evaluation further noted that this is “often due to the fact that the target groups do not think that the report contributes new knowledge or that it has not come in time to be used in a current decision-making process”.

← 10. In 2018, Sida’s Evaluation Unit undertook four strategic evaluations, only one of which was part of a fully independent process. That same year, the unit supported 30 decentralised evaluations undertaken by Sida units that were primarily focused on individual programmes. A further 30 evaluations were undertaken by partners. See https://www.sida.se/contentassets/341c5138cc204fe48550a1853e108552/strategic-evaluation-plan-2018-english-external-use.pdf.

← 11. A guidance note for management responses was adopted in 2014, with a template covering the relevance of the report, its main conclusions, and a statement on whether the MFA agrees with the recommendations and proposed actions.
