1. Assessment and recommendations

In 2019, prior to the pandemic, Canada’s federal government invested around CAD 5 billion in its ALMPs to help individuals find work. For the largest funding stream, ESDC, Canada’s federal ministry with responsibility for employment insurance and ALMPs, transfers funding to the Provinces and Territories (PTs) through the Labour Market Transfers. These consist of Labour Market Development Agreements (LMDAs) and Workforce Development Agreements (WDAs). Under the transfers, PTs are required to consult annually with labour market stakeholders to inform labour market priorities and ensure that programming reflects local labour market conditions. Broad ALMP structures are laid down in federal legislation, but the delivery of these programmes to individuals and the exact mix of programmes offered is devolved to the PTs. They enjoy flexibility in the mix of programmes delivered and how they are delivered: whether to deliver in-house or to contract externally and, if the latter, how to design these contracts. Average ALMP costs vary widely across PTs, suggesting that this flexibility to tailor programmes is exercised in practice.

LMDAs provide eligible individuals with programmes such as skills training, recruitment and start-up subsidies, direct job creation and employment support services (including employment assistance services, which provide lighter-touch interventions such as employment counselling, job search assistance and needs assessments). In 2019, these supported some 630 000 individuals to find work. However, despite the size and scale of this funding, spending as a percentage of GDP in Canada is still below the OECD average on both passive and active labour market measures, and real spending per unemployed jobseeker in the decade to 2019 was some 22% lower than in the decade to 2008. Counselling services and training make up the majority of the total volume of programmes offered. Relative to other OECD countries, Canada has a strong focus on these in its basket of ALMPs. Its job counselling services serve as a gateway to the other programmes on offer.

When the LMDAs were established, they contained provisions to continuously evaluate the performance of ALMPs. This ensured that evidence-building was at the heart of policy delivery and enabled ESDC to build a rich evidence base to support policy making. All but 1 of the 13 PTs opted to conduct these evaluations jointly with ESDC. The evaluations are conducted in cycles and presently ESDC is working on the third cycle of evaluation.1 The first cycle took place between 1998 and 2012 and was completed on a bilateral basis with each of the 12 participating PTs, which limited the number of studies that could be run simultaneously. The second cycle, from 2012-17, augmented this arrangement so that analysis was conducted simultaneously for all the PTs, allowing conclusions to be developed much more rapidly. The changes made between the first and second cycles (and now into the third) provide insight into the improvements to both data and processes that Canada has made in its evaluation of ALMPs.

An extensive evaluation of Canada’s ALMPs has already been conducted, demonstrating that these programmes offer value for money to the taxpayer. This evaluation looks at a full suite of outcomes for individuals, including the impacts on income, employment insurance receipt and social assistance receipt. The work has shown that there are significant variations in programme impacts across both individuals and PTs, which may in part be related to the freedom that PTs have to design their programmes.

In Canada, at the federal level, each department is required to establish and maintain a robust, neutral evaluation function. In that context, ESDC has built a proficient evaluation directorate, which produces high-quality impact assessments of its ALMPs, replacing analysis that was previously contracted out. As ESDC demonstrated its ability to implement evaluations in-house effectively, to a high standard and within good timescales, it has been able to allocate greater resources to invest further in this area. This aligned with a shift in the Canadian Government around 2016 to focus much more on data, and to place much more weight on their availability and use in policy evaluation.

The choice over whether to deliver ALMP evaluations in-house or via external contractors is multi-faceted, and countries employ different approaches, many opting for some combination. The choice of delivery mode is influenced by decisions on the expertise needed to conduct the analysis, the possibility of making data available to external partners, the frequency of evaluation, and the management of contractors and the analytical narrative. Ministries with little or no analytical function will be better placed to contract out research.

There were a number of important elements to the successful shift towards a specialised in-house evaluation team in ESDC. Firstly, there was support and advocacy for the change from senior leadership within ESDC. Secondly, a wider governmental shift towards open data and evidence-based policy making provided broader support for the change. Thirdly, the presumption of cost savings generated a lot of goodwill: it had cost around CAD 1 million per annum to deliver the externally commissioned survey-based analysis, and the direct costs of external support were reduced to around CAD 70 000 per annum after the change. Even though this meant some internal diversion from previous analytical priorities, the optics of cost reduction can play an important role in guiding risk management and informing resource allocation within governmental organisations.

To support this, a core of analytical expertise with the requisite skills and motivation was built to support the transition. Dedicated resources for the evaluation of ALMPs were put in place. This was combined with a refocusing of other analytical capacity with an explicit mandate to manage the process of co-ordination with the PTs and to bring qualitative evidence to bear on the evaluations (vital for bringing contextual information to the analysis). Analytical expertise could now be devoted separately to data development, methodological work, qualitative research and project management. Crucial to this was the employment of expert external consultants to advise on methodology and outputs, providing credibility to the results and helping to lay down the initial framework for the evaluation and the data requirements.

ESDC’s evaluation directorate is separated from the function that is responsible for ALMP development and implementation. The branch where the evaluation directorate sits has responsibility for the strategic management of data, for programme evaluation and for intergovernmental and international relations. It contains the evaluation directorate, which conducts all of the counterfactual impact evaluation of ALMP, and the Chief Data Officer directorate, which has responsibility for data management, data integration, data access and security in alignment with ESDC’s enterprise Data Strategy.

The evaluation directorate was able to build and grow its internal capacity on methodology and data, reducing its reliance on external experts and data collection efforts. By March 2022, the directorate allocated about ten people to this type of evaluation activity. This has allowed the directorate to become more ambitious in both the scope and the content of its work. For example, it has now started to conduct gender-based evaluations and is investing significant capacity into machine learning for the third cycle of evaluation, with the aim of even greater sub-group analysis of programme impacts.

By increasingly relying on administrative data and its internal capacity to conduct advanced quantitative analysis, the evaluation directorate has efficiently compartmentalised its resources to increase specialisation. It has three main areas of specialisation relating to its evaluation of the LMDAs, across data preparation, impact analysis and the organisation of analysis with PTs. The data preparation team ensures that data provided by the Chief Data Officer directorate are assimilated and organised into the appropriate datasets for evaluation. The impact analysis team then works directly with these data, applying rigorous statistical techniques to estimate programme effects. A separate team manages interactions with PTs, including the organisation and planning of the joint evaluation work and the conduct of any qualitative research carried out locally.

The Chief Data Officer directorate was established in 2016 to oversee the department’s data strategy. This has facilitated the establishment of data processes to embed best practices in data architecture, data management, processing, data development and data quality assurance. In line with its vision of driving towards better services and outcomes for clients by treating data as a shared, protected enterprise asset, grounded in a culture of stewardship and collaboration, ESDC’s strategy is guiding efforts to embed its data in a cloud infrastructure, streamline data access protocols and ensure common standards for its different data products.

The separation of evaluation and policy within ESDC brings benefits and challenges. Centralising evaluation means that it is easier to co-ordinate evaluations, share expertise and gain through a coalescence of expertise in that area. The challenge is then to ensure priorities are aligned with policy and implementation work and that there is no duplication of work by analysts in other areas.

Extensive communication, organisation and collaboration are needed for ESDC and PTs to jointly plan and conduct the evaluation of the LMDAs. Formal governance procedures and honest and accountable leadership have been cited as laying the foundations for the relationship between ESDC and PTs. Governance procedures mean that all parties have a voice in proceedings, formalised through an evaluation steering committee. The Forum of Labour Market Ministers provides an avenue for ministers from federal, provincial and territorial levels to discuss higher-level issues. Its working groups allow for information sharing and discussion of issues between PTs and federal officials and ensure that any major issues arising from the evaluation work can be discussed further among senior policy makers.

Conducting evaluation of ALMPs requires the availability of high-quality data that are detailed enough to ensure that the estimates made are robust and reliable. Data are needed on the outcomes individuals experience subsequent to participation, including information on income and subsequent benefit receipt. Randomised studies, which ensure participants and non-participants are alike in every respect, require less data to estimate impact. Observational studies, where entry to the ALMPs is not controlled, carry a higher data burden, as these data are used to ensure that similar participants and non-participants are compared. Data can be collected in numerous ways. Administrative data are accurate, cheap and cover the full population, but can be sparse in terms of, for example, the characteristics of programme participants beyond age and gender. Survey data offer greater opportunity to tailor data collection but are more expensive to collect, are difficult to collect in the same volume as administrative data and can suffer from measurement issues. ESDC has made important improvements to its evaluation work by switching from survey-based data collection to using its administrative data linked to Canada Revenue Agency data on income.

In the first cycle of ALMP evaluation conducted by ESDC, data were collected via surveys at the provincial level. This was primarily done to collect income data for participants, but given the observational nature of the evaluation, it was also necessary to gather detailed socio-economic data, on matters such as family status, education and previous employment, to compare alike participants and non-participants. Samples also had to be drawn from eligible non-participants to create a comparison group. This was both cumbersome and expensive. The surveys took a long time to plan and complete, entailing the use of external contractors to collect data. The resources required to conduct detailed surveys of individuals at any degree of scale entail a significant investment in personnel that government agencies rarely possess; instead, contracting out to specialist external research firms is commonplace. Coupled with the analytical capacity available within ESDC at the time to manage these contractors, this meant that a maximum of two to three studies could be ongoing at any point. As a result, the first cycle of evaluation took around ten years to complete. The surveys were also prone to non-response, meaning oversampling was required to ensure sufficient sample sizes (increasing delivery costs). Recall error from survey participants also impacted the accuracy of the data collected.

Furthermore, once comprehensive comparisons were made between administrative data on earnings and the survey data collected from participants and non-participants, interesting recall errors were revealed that were systematically related to participation. Participants in ALMPs were found to under-estimate their incomes, whilst non-participants were found to over-estimate theirs. The systematic nature of these recall errors would cause programme impacts to be underestimated, leading to biased conclusions on their effectiveness. The move to administrative data improved accuracy as much as it reduced costs.
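The direction of this bias can be seen with a small numeric illustration (all figures are invented for exposition, not taken from the ESDC comparisons):

```python
# Hypothetical illustration of how systematic recall error biases a
# survey-based impact estimate downward. All figures are invented.

true_participant_income = 30_000      # actual post-programme income
true_comparison_income = 28_000      # actual income, comparison group
true_impact = true_participant_income - true_comparison_income  # CAD 2 000

# Participants under-report by 5%; non-participants over-report by 5%.
reported_participant = true_participant_income * 0.95
reported_comparison = true_comparison_income * 1.05

measured_impact = reported_participant - reported_comparison

print(f"True impact:     CAD {true_impact:,.0f}")
print(f"Measured impact: CAD {measured_impact:,.0f}")
# Under these recall errors, a genuine CAD 2 000 gain appears as a loss.
```

Even modest, offsetting reporting errors of this kind are enough to flip the sign of an estimated impact, which is why the switch to administrative earnings data mattered for accuracy and not only for cost.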

ESDC collated separate administrative data sources to establish the Labour Market Program Data Platform (LMPDP), a comprehensive platform for analysis, where data are anonymised to protect personal information. The LMPDP enables ESDC to look at a suite of information relating to ALMPs, including patterns of participation, eligibility for participation, patterns of employment insurance and social assistance receipt, annual sources of income, and annual job patterns. These data are compiled in stages, taking the underlying data to create a unified, consistent dataset on which evaluation can be performed. Data are combined to ensure participation in various ALMPs is chronologically consistent. Patterns of eligibility for ALMPs are derived for non-participants in order to construct a comparison group. Data from the Canada Revenue Agency (CRA) are then added to observe employment outcomes. The integration of the CRA data was vital in enabling ESDC to look at impacts on employment income, one of the key requirements for any comprehensive ALMP assessment. These data also provided key information on social assistance receipt, which was not obtainable from a centralised register due to its delivery at the PT level. The creation of this platform ensures efficient use of resources (data do not have to be continually re-worked), consistency across different evaluations, and the build-up of institutional knowledge of the data via their repeated use.

ESDC integrates rich, but protected, information on individuals that is vital in conducting its counterfactual impact assessments. Socio-demographic and historic earnings and benefit data contained in the LMPDP provide a strong basis on which to compare alike individuals who did and did not participate in ALMPs. ESDC makes use of up to 75 socio-demographic and labour market variables, which are observed over five years prior to the participation period. Past patterns of income and benefit receipt are particularly important to serve as proxies for factors that are not captured in administrative data (for example, motivation or ability).

However, there are still some areas for which more data would be beneficial to inform the analysis. Data on education (e.g. field of study) would be particularly useful for young people, for whom there is limited information on past income or benefit receipt with which to compare individuals with similar characteristics. Similarly, information on the presence of children would allow better consideration to be given to the labour market participation decisions of parents.

The use of linked administrative data means that the outcomes analysis benefits from being accurate and comprehensive, permitting high-quality assessments of how programmes influence participants’ subsequent outcomes. A wide range of key aggregate outcomes are considered in the analysis. CRA data on employment income and on social assistance received in a year are combined with ESDC employment insurance receipt data. This means that a full account can be given of the main sources of income that an individual might have in the period after participation.

However, due to data limitations, the current analysis does not consider issues of job quality apart from earnings. Information on employment spells is not recorded in a central register and is not available via the CRA data, so the analysis is unable to depict how ALMPs affect job tenure. Attachment to an employer, and whether participation has resulted in the individual finding a better job match (reflected in part by employment duration), is not directly observed. Similarly, the type of contract individuals are employed on, such as whether it is full-time or part-time, or permanent or temporary, is not observed in the administrative data. This type of information would help to build a picture of the impact of ALMPs on job dynamics.

Similarly, the timeliness and aggregation of the income data also impinge on the analysis. At present, income data lag two years behind the ALMP participation data, limiting ESDC’s ability to produce up-to-date evaluations of its ALMPs. The annualised nature of the CRA income data also raises questions over their suitability for analysing the impact of job counselling programmes, whose benefits are comparatively short-lived and smaller in scale. Assignment of earnings to post- and pre-programme years is non-trivial in such instances, where the majority of the impact is likely to accrue in the same year as programme participation. More temporally disaggregated data would help to mitigate these issues.

ESDC uses rigorous and credible methods to evaluate the outcomes of its ALMPs. The ESDC evaluation strategy proceeds as an observational study, because PTs already plan and deliver ALMPs to their citizens. ESDC then uses data on participation in ALMPs and attempts to compare similar participants and non-participants. This type of analysis is essential when policies are either already in operation or cannot be tested prior to full-scale roll-out. It means that it is not possible to control entry into programmes, and efforts need to be made to find individuals who did not enrol on the programme but who are similar to those who did.

In order to conduct analysis in this manner, ESDC makes use of its rich administrative data and employs methods that ensure participants and non-participants are alike in every observable respect, apart from participation in the programme. The first stage of the analysis organises the participant and non-participant groups so that individuals who are not sufficiently similar to others are removed; this reduces the computation and analytical processing time required in the subsequent stage. Participants and non-participants are then “matched” to each other using the observable characteristics available in the administrative data. This aims to remove differences in outcomes between individuals that would have occurred even without programme participation. A final stage is then performed, so that only changes in outcomes between participants and non-participants are compared: rather than comparing income between participants and non-participants directly, the change in their income from the period before the programme to after the programme is compared. This helps to control for factors that may influence both programme participation and outcome variables but are not available in the administrative data. These methods are credible, respected and widespread in the academic literature. By comparing how the estimates change between the second and final stages of the analysis, ESDC can identify how sensitive the estimates are to the method used, supporting discussion of how well the techniques control for differences between participants.
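The staged approach described above can be sketched in a stylised way. The code below simulates invented data in which lower earners are more likely to enrol, matches each participant to the non-participant with the closest pre-programme earnings, and then applies a difference-in-differences comparison of changes in earnings. It is a minimal illustration of the technique, not ESDC’s actual implementation (which matches on up to 75 variables):

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented population: pre-programme earnings drive both enrolment and
# outcomes, and the programme raises earnings by a known amount so the
# estimator can be checked against the truth.
n = 2000
pre = rng.normal(30_000, 8_000, n)                    # pre-programme earnings
p_enrol = 1 / (1 + np.exp((pre - 30_000) / 8_000))    # lower earners enrol more
treated = rng.random(n) < p_enrol
true_effect = 1_500
post = pre + rng.normal(1_000, 3_000, n) + true_effect * treated

# Matching stage: pair each participant with the non-participant whose
# pre-programme earnings are closest (approximate nearest neighbour).
c_idx = np.where(~treated)[0]
order = np.argsort(pre[c_idx])
c_sorted = pre[c_idx][order]
t_idx = np.where(treated)[0]
pos = np.clip(np.searchsorted(c_sorted, pre[t_idx]), 0, len(c_idx) - 1)
match = c_idx[order[pos]]

# Final stage: difference-in-differences on the matched pairs — compare
# *changes* in earnings, netting out remaining pre-programme differences.
did = np.mean((post[t_idx] - pre[t_idx]) - (post[match] - pre[match]))
print(f"Matched DiD estimate: CAD {did:,.0f} (true effect: CAD {true_effect:,})")
```

Because selection here depends only on observed pre-programme earnings, the matched difference-in-differences estimate recovers something close to the true effect; in real data, the adequacy of the observed matching variables is exactly what the later robustness checks probe.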

ESDC assesses packages of support rather than individual programmes; these packages can contain more than one programme. This is not a problem for the ESDC impact assessment itself, but it does make comparison across programmes slightly more difficult. Re-framing the analysis so that it isolates the impact of a specific programme would facilitate comparisons between the different programmes offered.

Administrative data are used to split participants into different groups to analyse effects separately for different types of individuals. ESDC also uses the administrative data to look at other groups of interest, including youth, older workers, long-tenured workers, and groups defined by gender, ethnicity, immigration status, and self-identification as a “visible minority” or as having a disability. This allows ESDC to evaluate whether programmes are more or less effective for different groups of individuals.

The third cycle of evaluation will see ESDC explore the use of machine learning algorithms to look at effect “heterogeneity” – a systematic data-driven way to reveal differences in programme outcomes for individuals (rather than on aggregate) that does not rely on a pre-specified disaggregation by the researcher. Machine learning algorithms also have the potential to automate parts of the process of analysis and remove the need for as much user-expertise in compiling estimators. This will offer an interesting opportunity to test whether significant further variation of participants is found by disaggregating them further.
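A minimal sketch of this kind of data-driven heterogeneity analysis follows, using invented data and simple polynomial fits as stand-ins for the machine learning models ESDC is exploring. Separate outcome models are fitted for participants and non-participants (a “T-learner” arrangement), and the difference between them traces how the estimated impact varies with a characteristic such as age:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented data: the true programme effect shrinks with age, so a
# heterogeneity-aware method should recover a declining impact profile.
n = 5000
age = rng.uniform(20, 60, n)
treated = rng.random(n) < 0.5
true_effect = 3_000 - 50 * age        # invented age-varying impact
income = 20_000 + 200 * age + true_effect * treated + rng.normal(0, 2_000, n)

# "T-learner": fit one outcome model per group, then difference them.
# Flexible ML models (e.g. forests) would replace these polynomial fits.
f_treated = np.poly1d(np.polyfit(age[treated], income[treated], 2))
f_control = np.poly1d(np.polyfit(age[~treated], income[~treated], 2))

grid = np.array([25, 40, 55])
cate = f_treated(grid) - f_control(grid)   # estimated impact by age
for a, c in zip(grid, cate):
    print(f"Estimated impact at age {a}: CAD {c:,.0f}")
```

The point of such methods is that the declining profile emerges from the data rather than from a researcher-specified split into “young” and “old” sub-groups.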

In addition to the sub-group analysis looking at different personal characteristics, ESDC separates out completely those ALMP participants without a current employment insurance claim. This is done primarily due to the problem of constructing a reliable counterfactual group of non-participants for these former employment insurance claimants using the administrative data. There are concerns that motivation for this group, which is unobservable in the administrative data, might be an influencing factor in participation in ALMPs. This would potentially mean estimates of ALMPs would be biased upwards (if motivation increased participation likelihood and also earnings). ESDC employs a clever technique to address some of the concerns for this group by re-basing the comparison group to be those former employment insurance claimants who participate in job counselling only, the least intensive ALMP offered. By doing this, ESDC is able to compare participants in other programmes to this group, but is able to resolve concerns around motivation for participation, by comparing with individuals who have already volunteered for the other programme. This comes with the expense of not being able to estimate the effect that job counselling itself has on these individuals.

ESDC conducts checks across a wide range of different areas of the analysis to ensure that results are as reliable and credible as possible. Tests are done to ensure estimates are not sensitive to the inclusion or omission of variables; large changes in estimates in such cases would suggest the estimates did not give a good general assessment of the programme’s impact on participants. ESDC uses statistical tests to aid selection of the variables it includes in its statistical analysis, reducing the room for human error. Tests are also conducted to ensure that all participants and non-participants selected have an individual who is sufficiently alike to them. There are several different algorithms that can be used to compare individuals to one another, and ESDC utilises a range of these to demonstrate that its central choice of algorithm does not produce significantly different results from the others available, lending weight to the stability of the estimates. It also inspects whether differences in outcome variables between participants and non-participants are stable in the pre-programme period. This is crucial to the final step of the analysis, where pre- and post-programme outcome differences are compared.

Comparing the results from the regional analysis for PTs to the overall results for Canada could provide more insight into whether the estimates give a good general assessment of a specific programme. Splitting the whole original dataset randomly in two, using the first half to estimate the statistical parameters and then applying these to the second half, would also permit further insight into the ability of the estimates to give a good general assessment of the ALMPs.
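The split-sample check suggested here could look something like the following sketch, using invented data: parameters estimated on one random half of the data are applied to the held-out half, and similar in- and out-of-sample fit indicates that the estimates generalise.

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented data: a single stand-in covariate and outcome.
n = 4000
x = rng.normal(0, 1, n)               # stand-in for a matching covariate
y = 2.0 * x + rng.normal(0, 1, n)     # stand-in outcome

# Random split into estimation and hold-out halves.
idx = rng.permutation(n)
train, test = idx[: n // 2], idx[n // 2 :]

# Estimate parameters on the first half only.
slope, intercept = np.polyfit(x[train], y[train], 1)

# Apply them to the held-out half and compare fit.
in_rmse = np.sqrt(np.mean((slope * x[train] + intercept - y[train]) ** 2))
out_rmse = np.sqrt(np.mean((slope * x[test] + intercept - y[test]) ** 2))
print(f"In-sample RMSE:     {in_rmse:.2f}")
print(f"Out-of-sample RMSE: {out_rmse:.2f}")
```

A large gap between the two fit statistics would signal that the estimated parameters are over-fitted to the estimation sample and give a poor general assessment.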

Quality assurance is rigorous, and high-quality external peer reviewers are used to guide the analysis. ESDC has implemented a set of corporate procedures on quality assurance for conducting analysis and ensuring that methods and outputs are shared with expert external peer reviewers for feedback.

The establishment of an in-house methodology unit has allowed ESDC to develop guidelines on the processes to follow in evaluations. Separating out this process from the team conducting the evaluation allows specialisation – guidance is developed independently and so the risk that the extent and nature of the checks in place is being driven by the analysis that has been conducted is reduced. Guidelines and processes are developed from first principles. Each of the stages developed has multiple checks to complete to ensure data accuracy and methodological rigour, comprising checks on data use, checks on the code to extract data and run analysis, reviews of methodological development for rigour and checks on analytical outputs for consistency.

Also important has been the continued tenure of many of the analytical team, a core of whom have been in post since the beginning, or close to the beginning, of the shift towards in-house delivery. This has meant that expertise has been built around them that is cognisant of the gaps, knows the organisational and policy boundaries and is able to plan for change. The importance of staff retention in this respect cannot be overstated. Investments have been made so that data and methodology are properly documented, laying the foundations for future analysts; this will be critical to ensure business continuity as a new generation of ESDC officials gradually takes the helm.

ESDC analytical staff have also presented their work to numerous academic and government conference audiences, benefiting further from the socialisation of techniques and results and the discussion that ensues.

Throughout the period of development of in-house expertise, constant engagement with peer reviewers has provided a critical sounding board to develop methodological strategies and ensure ongoing professional development for ESDC analysts, who learn practically from experts in the field whilst conducting their work. A feature of this work has been the sustained use of the same peer reviewers, providing continuity and enabling the reviewers to build a deep understanding of the policy landscape and data availability. It also gave the peer reviewers a deeper understanding of the skillset of the ESDC analytical team, so that insights are both rich and nuanced, fully exploiting the abilities of both the reviewers and the in-house team.

The use of peer reviewers has allowed ESDC to conduct analysis in-house whilst maintaining the ability to conduct credible analysis using the most up-to-date techniques. When countries decide whether to contract out analysis or to conduct it internally, quality assurance is a key consideration. By contracting out, it is possible to choose institutions with the specialised expertise necessary to conduct evaluations; if the work is carried out internally, resources need to be employed directly. ESDC has employed a hybrid model, whereby analysts are employed within ESDC to carry out the technical analytical work, while academic peer reviewers are engaged to advise on techniques and data queries and to review the outputs of the work, including initial advice on the data and the underlying quantitative methodology to use. This has allowed ESDC to develop institutional knowledge and expertise in the methods used as the evaluations have progressed.

ESDC takes its impact assessments a step further to consider the value for money offered by ALMPs. Evaluation studies often stop short of a full assessment of a programme’s worth by considering only outcomes for participants. A participant may earn CAD 1 000 more following an intervention, but if it cost CAD 2 000 to deliver the programme, it may not make sense to proceed with it. Looking only at the change in earnings precludes such assessments. ESDC conducts a full cost-benefit analysis of its ALMPs, with several years of post-programme follow-up. This assessment is only possible because ESDC has already used robust techniques to isolate the impacts on earnings and benefit receipt that are solely attributable to the programme. Not only does this allow the federal government and PTs to make planning decisions secure in the knowledge of the return on their investment, but, by evaluating programmes individually, it allows PTs to assess the relative mix of programmes to which they allocate funding. Alongside producing a central cost-benefit estimate, ESDC also varies the assumptions it uses for three key variables: how values in future years are adjusted to provide a present value; the cost to society of using government funds for programming; and the time horizon over which costs and benefits accrue. This demonstrates how sensitive the results are to these variables. ESDC also details wider costs and benefits that it does not account for in the analysis, including benefits to mental and physical well-being, effects on crime, and spillovers to the broader economy.
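The sensitivity analysis around the central cost-benefit estimate can be illustrated with a stylised calculation (all figures invented): the net present value of a programme is recomputed under different discount rates and time horizons.

```python
# Stylised cost-benefit calculation with sensitivity to the discount rate
# and time horizon. All figures are invented for illustration.

programme_cost = 8_000     # up-front cost per participant, CAD
annual_benefit = 1_500     # estimated yearly earnings gain, CAD

def npv(rate: float, years: int) -> float:
    """Present value of benefits minus cost, discounting each future year."""
    benefits = sum(annual_benefit / (1 + rate) ** t for t in range(1, years + 1))
    return benefits - programme_cost

for rate in (0.03, 0.05, 0.08):
    for years in (5, 10):
        print(f"rate={rate:.0%}, horizon={years}y: NPV = CAD {npv(rate, years):,.0f}")
```

Under these invented figures the programme breaks even only over the longer horizon, showing how conclusions about value for money can hinge on the discounting assumptions that ESDC varies.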

Extending this cost-benefit analysis in a number of ways would give ESDC the ability to make an even more powerful and comprehensive assessment of its ALMPs. Because of the re-distributive nature of ALMPs, which help relatively more disadvantaged individuals, sensitivity analysis that weights the outcomes of the policy can help to ensure that the benefits of participation are fully accounted for. This could be done by estimating the difference in income between ALMP participants and the average income of taxpayers, and using this to calculate the extra benefit these individuals receive as a result of the income transfer to them: the value of an additional dollar of income for someone further down the income scale is potentially higher than for someone with a higher income. By providing additional sensitivity analysis that demonstrates how much difference this makes to the estimates, ESDC would be better able to position the benefits of its ALMPs relative to other, less redistributive policies. Full incorporation of data on health outcomes would allow a more holistic view of the potential benefits of work on individuals’ underlying health and the subsequent impact on people themselves and on government outlays. In addition, extensions to existing work on the uncertainty around cost-benefit estimates would allow a more refined communication of the plausible range of outcomes.
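The proposed distributional weighting could be sketched as follows, with invented figures: benefits accruing to lower-income participants are scaled up by a welfare weight that depends on the gap between participant income and a reference (average taxpayer) income, governed by an inequality-aversion parameter.

```python
# Stylised distributional weighting of programme benefits. All figures and
# the parameter name "eta" (inequality aversion) are assumptions for
# illustration; a weight of (reference / recipient) ** eta is one common form.

reference_income = 55_000    # assumed average taxpayer income, CAD
participant_income = 25_000  # assumed typical participant income, CAD
benefit = 1_000              # unweighted earnings gain, CAD

def weighted_benefit(eta: float) -> float:
    """Scale the benefit by a welfare weight; eta = 0 leaves it unweighted."""
    weight = (reference_income / participant_income) ** eta
    return benefit * weight

for eta in (0.0, 0.5, 1.0):
    print(f"eta={eta}: weighted benefit = CAD {weighted_benefit(eta):,.0f}")
```

Reporting results across a range of eta values, alongside the unweighted case, would show how much the redistributive character of ALMPs adds to their measured benefits.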

At present, analysis of ALMPs can only be conducted by the evaluation directorate within ESDC, owing to privacy and personal information protection requirements. While resources have been increased over the years to permit more in-house evaluation work, the extent of the analysis conducted is constrained by the limits of this resource. Increasing data availability to external researchers would facilitate evaluation of ALMPs, promoting innovation and providing a useful cross-reference for the existing work done by ESDC.

Statistics Canada is leading the way in increasing data availability to researchers, offering two access routes: unrestricted access to carefully de-identified data, or restricted access, facilitated at Research Data Centres, which offers a wider range of data. Under neither route, however, is it currently possible to evaluate ALMPs, because Statistics Canada does not house the necessary data on ALMP participation. It does hold data on income from the Canada Revenue Agency (CRA) and on employment insurance receipt, so most of the underlying data used by ESDC are already available; the sole addition of ALMP participation data (already used by ESDC in its evaluations) would make ALMP evaluation possible.

Many countries have made such data available to researchers and benefit from the expanded research capacity this engenders. Some facilitate access through quasi-governmental research bodies (such as the Institut für Arbeitsmarkt- und Berufsforschung in Germany or the Institute for Labour Market Policy Evaluation in Sweden), which both share data and concentrate the resources needed to conduct the research within a single institution. Others, like Canada, organise access through statistics institutions; Statistics Finland offers a good example of the range of data that can be organised and shared in this way. A small expansion of the data available through Statistics Canada, sharing the PT data on ALMP participation that already underpins joint evaluation work by PTs and ESDC, would make great strides in democratising data access and opening evaluation up to external researchers.

One area that merits further consideration by ESDC is the use of randomised studies to evaluate different aspects of policy delivery. Keeping trials small would allow policy questions to be evaluated without significantly affecting existing delivery, and would make the trials easier to manage. As expertise was built, or demand for them increased, they could be scaled up accordingly.
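What “small” means in practice can be gauged with a standard back-of-envelope sample-size calculation for a two-arm trial. The effect size and outcome variability below are hypothetical placeholders, used only to show the arithmetic:

```python
# Back-of-envelope sample size for a small two-arm randomised trial,
# using the standard two-sample normal approximation:
#   n per arm = 2 * (z_alpha/2 + z_beta)^2 * sd^2 / effect^2
# Effect size and standard deviation below are hypothetical.

import math

def n_per_arm(effect, sd, alpha_z=1.96, power_z=0.84):
    """Participants needed in each arm to detect `effect` (a difference in
    mean outcomes) with ~80% power at the 5% significance level."""
    return math.ceil(2 * ((alpha_z + power_z) ** 2) * (sd / effect) ** 2)

# E.g. detecting a CAD 1 000 earnings gain when outcomes vary with sd CAD 5 000
print(n_per_arm(effect=1_000, sd=5_000))  # roughly 400 per arm
```

Larger detectable effects or less noisy outcomes shrink the required sample quickly, which is why narrowly targeted delivery questions can often be answered with trials far smaller than full programme caseloads.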

Because PTs have significant flexibility in their delivery of ALMPs, there is wide variation in how programmes are delivered, for example whether in-house by provincial government or out-sourced to external contractors. The precise content of programmes can vary similarly, for example in the intensity of the job counselling services offered. These issues have not yet been addressed in the evaluation work that has taken place, which concentrates on the aggregate value for money of programmes (the value for money of training against no training, rather than, say, the intensity of that training).

Trials are useful because they allow a careful exploration of such policy questions and can be designed to produce evidence on specific policy designs of interest. Randomisation also yields more robust estimates, as it ensures that any difference in outcomes between the treatment and control groups is attributable to the programme. Denmark offers an example of how a co-ordinated approach to randomised studies over time can systematically address evidence gaps on policies. By considering precisely how PTs deliver policy, ESDC could address questions on the best structure and delivery mode for ALMPs and improve policy beyond broader comparisons of overall programme type (for example, whether a more intensive training programme is better, rather than just comparing training to counselling services). Trials can also complement the existing evaluation work conducted by ESDC, as both can run simultaneously: a small trial does not disrupt the ongoing delivery of ALMPs to individuals and allows innovation to occur without a significant change to delivery in the meantime. Canada’s Future Skills strategy includes the Future Skills Centre, which already offers the opportunity to run such trials independently on ALMPs that aim to build skills.

ESDC has continually innovated its evaluation strategy and is now moving towards the vanguard of policy evaluation techniques. Much of this is done iteratively: ESDC and the PTs use their evaluation committee to review progress and establish priorities and work streams, and conducting analysis in cycles provides a natural breakpoint for review.

Evaluation work in ESDC proceeds in an open and transparent fashion. Evaluation reports are published for Canada and for each participating PT separately, detailing the impact assessments made for each of the underlying ALMPs. This allows ESDC to tell a positive story about its ALMPs, using the evidence generated on their value for money. The requirement in bilateral agreements to evaluate ALMPs cyclically has provided a useful checkpoint that ensures evaluation is carried out routinely, making it far less likely to succumb to budgetary or political pressures.

The combination of a federal evaluation framework and work within ESDC to foster transparency means that ALMP evaluation work programmes are well defined, clear and accountable. The federal Policy on Results, introduced in 2016, sets out clear obligations for ministries on the conduct and publication of evaluations. Furthermore, the ESDC evaluation directorate published a paper in the Canadian Journal of Program Evaluation in 2017 setting out the motives and rationale behind its evaluation strategy. In addition, its efforts to present its work at external seminars serve not only as a form of peer review but also as a means of fostering this transparency. Jointly, these efforts mean that a clear and coherent ALMP evaluation work programme is visible and transparent to the public.

However, adjustments could be made to further improve the reach and understanding of ESDC communication. More could be done to tailor communication to different audiences and to ensure that messages are shared in the right formats and forums. The Jobeffekter website of the Danish Agency for Labour Market and Recruitment offers a good example of how evaluation information can be packaged into different bundles, so that readers of different technical ability can find information accessible to them. Avoiding jargon and focusing key messages would make communications easier to understand and more meaningful. Better promotion of analytical work, via social media and news channels, would extend the reach of the analysis. In the United States, MDRC uses newsletters and social media channels to share its work with wider audiences, encouraging greater dissemination. Publishing the peer review summaries that ESDC collects in the course of evaluating ALMPs would further promote trust in, and understanding of, the evaluation.

Note

1. These cycles refer to the evaluation of the Labour Market Development Agreements, the main federal funding vehicle for ALMPs.

Metadata, Legal and Rights

This document, as well as any data and map included herein, are without prejudice to the status of or sovereignty over any territory, to the delimitation of international frontiers and boundaries and to the name of any territory, city or area. Extracts from publications may be subject to additional disclaimers, which are set out in the complete version of the publication, available at the link provided.

© OECD 2022

The use of this work, whether digital or print, is governed by the Terms and Conditions to be found at https://www.oecd.org/termsandconditions.