Chapter 4. Impact of adult learning

For job-related adult education and training to have a positive impact on labour market outcomes for individuals, firms and societies, it is imperative that the training provided is of high quality and relates closely to the skills needed by employers. It is also necessary that good information on the quality and outcomes of training programmes and providers is available to help people make informed decisions on investment in adult learning. In addition, an enabling environment at the workplace is essential to put acquired skills to good use. This chapter provides evidence on the perceived impact of participation in adult education and training, and looks at how evaluation and quality assurance are regulated and how information on training quality is shared with the wider public. It also provides examples of how firms can foster the best use of their employees’ skills.


The statistical data for Israel are supplied by and under the responsibility of the relevant Israeli authorities. The use of such data by the OECD is without prejudice to the status of the Golan Heights, East Jerusalem and Israeli settlements in the West Bank under the terms of international law.

4.1. Ensuring that participation in adult learning has the desired impact

As discussed in the previous chapters, a range of policies, initiatives and incentives are in place in OECD countries to encourage participation in adult learning activities. In many countries, particular efforts are made to ensure access to training for underrepresented groups, and to bring adult learning provision in line with labour market needs. These efforts contribute to ensuring that training has a positive impact, by making sure that adults who need training will develop the right skills. However, increasing participation and aligning provision to the needs of the labour market are unlikely to have the desired impact on skills development if the training provision itself is of low quality. Further, it is important that information about the quality of adult learning provision is communicated widely, such that prospective participants can make informed choices. As argued by the OECD (2005[1]), poor-quality learning programmes and a lack of awareness of programme outcomes can contribute to under-investment and low participation in adult learning. Although quality assurance is essential, it also faces several challenges: in most countries the number of providers is extremely large and likely to increase as the demand for adult learning rises. Furthermore, the trend towards more flexible adult learning provision, for example through e-learning, poses new challenges for quality assurance.

Clear and well-defined quality assurance systems generally exist for formal education, but much less so for non-formal education and training (Broek and Buiskool, 2013[2]), where there is a wide and diverse range of providers (see Figure 4.1). Employers are the main providers of non-formal adult learning (31% of participants), followed by non-formal education and training institutions (20%). At the employer level, training can take place internally, or can be delivered by external providers. The majority of enterprises draw on external organisations to provide training, most of which are private training organisations.

According to Broek and Buiskool (2013[2]), countries that have well-established quality systems in place for formal and non-formal adult learning are generally also the ones that have higher participation in adult learning. While there is a general consensus that investing in quality assurance mechanisms is worthwhile, there is a lack of empirical evidence to support the argument. In general, evaluations of quality systems in (non-formal) adult learning are scarce.

Not only is the quality of skills development important for training to have an impact on labour market outcomes, but so is the extent to which newly acquired skills are used in the workplace. As shown by the OECD (2016[3]), the presence of High-Performance Work Practices (HPWP) in the workplace is associated with increased skills use. Employers can foster more intensive skill use through incentive systems, like bonus payments and flexible working hours, and governments and the social partners can support the implementation of these types of practices.

Figure 4.1. Providers of non-formal adult education and training

Note: Average of European OECD countries participating in the respective surveys. Data for the top panel refer to job-related and non-job-related non-formal learning. Data for the bottom panels only include enterprises with at least ten employees.

Source: Panel A: AES (2016), Panel B and C: CVTS (2015).

4.2. Impact of adult learning - results from the PAL dashboard

The impact of participation in adult learning is a multi-dimensional concept that can be measured in many different ways and is often difficult to observe directly. Therefore, internationally comparable information is scarce. The PAL dashboard focusses on the perceived impact of training by looking at: self-reported satisfaction, skill use and labour market outcomes, and the wage returns of training participation (see Table 4.1). While these dimensions reflect important aspects of the impact of adult education and training, they do not provide a full picture. More internationally comparable and objective data are needed to draw a fuller picture of how the impact of adult education and training differs between countries.

Table 4.1. Perceived impact of adult learning – PAL indicators

Perceived impact

| Sub-dimension | Indicator | Measure |
| --- | --- | --- |
| Usefulness and effectiveness | Usefulness of training | % participants for whom at least one training activity was “very useful” for their job |
| Usefulness and effectiveness | Use of acquired skills | % of participants currently using or expecting to use the acquired skills |
| Usefulness and effectiveness | Impact on employment outcomes | % of participants for whom the acquired skills helped achieve positive employment outcomes |
| Wage returns | Wage returns | Wage returns to formal or non-formal adult learning |

Note: See Annex B for details on the data sources used for each indicator.

According to the PAL dashboard, the countries that perform best across the different dimensions of perceived impact of adult learning are Chile, Hungary, Latvia and Portugal (Figure 4.2). The lowest overall scores are recorded for Israel, Japan and the Netherlands. The following subsections describe the results in more detail.

Figure 4.2. Results of the Perceived Impact dimension
Perceived Impact index (0-1)

Note: The index ranges between 0 (lowest perceived impact) and 1 (highest perceived impact).

Source: See Annex B and C for details on data sources and methodology.

The usefulness of training can be measured both in terms of perceived usefulness among participants and actual use of acquired skills. While the former reflects a personal judgement of the training content, the latter refers to the extent to which the skills are being used or expected to be used in practice. On average across OECD countries participating in the PIAAC survey, 52% of adults found their formal or non-formal training activity very useful for the job they had at the time of the learning activity. This average hides large differences in satisfaction levels between OECD countries (see Figure 4.3), with levels ranging from 24% of participants in Japan to 82% in Denmark.

Similarly, countries differ in the extent to which participants actually use or expect to use the skills they acquired in their training activity, although these differences are much smaller than for self-reported usefulness (Figure 4.3).1 In the Czech Republic, Italy, Latvia, Lithuania, Poland and Turkey more than 90% of participants use or expect to use their newly acquired skills, whereas this is only the case for 73% of participants in the Netherlands. Interestingly, the indicators on self-perceived usefulness of training and actual use of acquired skills are only weakly (and even negatively) correlated. Lithuania, for example, combines a very high share of people using or expecting to use their acquired skills (96%) with a relatively low share of participants finding the training very useful (42%). The weak correlation between the two indicators may be explained by the fact that usefulness only refers to how useful training was for the job held at the time of participation, whereas the use of skills refers more broadly to whether or not individuals are using or expecting to use the acquired skills in any situation.

The effectiveness of training can be assessed by looking at whether or not the training activities had the desired impact for the participant. Individuals usually participate in job-related training to improve their productivity, increase their career prospects or to find a new job. Across European OECD countries, 67% of participants state that training helped them achieve positive employment outcomes.2 This self-reported positive effect on employment outcomes is biggest in Portugal (80%), Italy (82%), Slovenia (82%) and Hungary (87%), and smallest in Germany, the Netherlands and Turkey (Figure 4.3). There is no clear relationship between the use of skills acquired through non-formal job-related training and the self-reported impact on employment outcomes. In some countries a large share of adults report that they use the acquired skills and that training had a positive impact on their employment outcome (e.g. Italy, Spain), while there are other countries where relatively few adults report a positive impact on employment outcomes even though many adults use the acquired skills (e.g. Turkey, Switzerland).

Figure 4.3. The usefulness, use and effectiveness of skills acquired in adult learning
% of participants in job-related adult learning

Note: Data on usefulness refer to formal and non-formal job-related training, data on use and employment outcomes to non-formal job-related training only. The data on usefulness for Belgium refer to Flanders only, for the United Kingdom to England and Northern Ireland only.

Source: AES (2016), PIAAC (2012, 2015), WRTAL (Australia, 2016-17).

It is important to keep in mind that the self-reported effectiveness of training does not only reflect the quality of training, but also labour market conditions and other contextual factors more broadly.3 For example, in countries where the competition for jobs is fierce, training participation might have a much bigger impact on employment outcomes than in countries with relatively little competition for jobs. At the same time, even if the quality of training is high it may not have a very positive impact on employment outcomes in a situation of high unemployment. Moreover, perceptions might also be influenced by cultural factors, such as a general tendency to respond positively in surveys.

A more objective measure of the impact of training among workers is the wage returns to training. Wage returns are a measure of the impact of participation in adult education and training on the individuals’ wages. Controlling for a range of individual factors, Fialho, Quintini and Vandeweyer (2019[4]) find that the wage returns to participation in formal or non-formal job-related training are largest in Chile, Lithuania and Estonia and smallest in Greece, Denmark and Italy.4 While returns to training are a signal of its effectiveness, cross-country differences might also reflect the extent of flexibility in the wage-setting process. Differences in the returns could also reflect a different composition of training activities between countries, as some of them might focus more strongly on training that does not necessarily have an impact on wages (e.g. health and safety training).
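The idea of estimating wage returns while controlling for individual characteristics can be illustrated with a simple log-wage regression on synthetic data. This is a stylised sketch only: the variable names, coefficients and the single training dummy are illustrative assumptions, not the specification used by Fialho, Quintini and Vandeweyer (2019[4]).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Synthetic worker data: education and experience affect wages;
# training participation carries an assumed true return of 4%.
educ = rng.integers(8, 20, size=n).astype(float)   # years of education
exper = rng.uniform(0, 40, size=n)                 # years of experience
trained = (rng.random(n) < 0.4).astype(float)      # training dummy
log_wage = (1.0 + 0.08 * educ + 0.02 * exper
            + 0.04 * trained + rng.normal(scale=0.3, size=n))

# OLS of log wages on a constant, the controls and the training dummy:
# the coefficient on `trained` is the estimated wage return to training.
X = np.column_stack([np.ones(n), educ, exper, trained])
beta, *_ = np.linalg.lstsq(X, log_wage, rcond=None)
print(f"estimated return to training: {beta[3]:.3f}")  # close to the true 4%
```

Because the estimated return is a coefficient in a log-wage equation, it is interpreted as an approximate percentage wage premium associated with training participation, holding the included controls constant.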

4.3. Policies to ensure that training has the desired impact

This section discusses two key areas to ensure that participation in adult learning has the desired impact on labour market outcomes: quality assurance and skill use. Adult learning systems are characterised by a large number of training programmes, delivered by a large number of training providers. In France, for example, more than 92 000 training providers are officially registered, many of which are very small (République Française, 2018[5]). In such a large and scattered market, strong monitoring and evaluation frameworks are essential to ensure the quality of the training provided. It is also important that individuals, employers and institutions who want to participate in or provide training have access to sound information on the quality of different providers. With regard to the quality dimension, this section looks at how countries: i) assist training providers in offering high-quality programmes; ii) ensure high quality by certifying training providers and programmes; iii) measure the outcomes of training; and iv) share information on the outcomes of quality assessments with the general public. Countries can also achieve a larger impact of training by fostering better use of skills at the workplace, and this section describes how greater adoption by employers of high-performance working practices has been encouraged in different countries to ensure that skills are used optimally.

4.3.1. Guiding training providers to offer high-quality programmes

Measuring the quality of training is not easy, not even for training providers themselves, as quality is multi-dimensional and often subjective. Training providers could therefore benefit from support in implementing quality measures and monitoring and evaluation systems. This type of support is available in some countries, in the form of: i) guidelines, criteria and quality standards; ii) training to improve the knowledge about quality among training providers; and iii) support materials for training providers, such as good practice examples and self-evaluation tools.

Guidelines, criteria and quality standards can form the basis of a framework against which to evaluate the quality of training. Providing training providers with guidelines will help them understand what is considered quality training provision and how it is measured. In Japan, guidelines for vocational training services at private providers were developed in 2011. The guidelines present specific measures to improve the quality of vocational training services and management of private providers based on an international quality standard.5 Training accredited by the Department for Adult Training (Service de la Formation des Adultes) in Luxembourg has to follow quality criteria in the areas of i) equal access, ii) transparency, and iii) trained teachers. In the United States, the Workforce Innovation and Opportunity Act promotes quality in adult education and training activities through a system of performance indicators that holds states, local communities, and providers accountable for the learning and employment outcomes of participants.

An important step in having an effective quality assurance system is to build the capacity of staff in adult training institutions to have a good understanding of what quality is and how to monitor and assess it. In Japan, workshops are organised for training providers to get familiar with and better understand the quality guidelines. There have been discussions on making participation in these workshops compulsory for training providers that want to offer publicly funded training programmes. In Slovenia, a training programme was developed by the Slovenian Institute for Adult Education (SIAE) for individuals to become quality counsellors in adult education. Training providers who want to improve their quality management system can have one or more staff members participate in the training or hire a qualified quality counsellor.

Giving training providers access to support materials can also help them develop their quality systems. In Italy, the group involved in the Action Plan for Innovation in Adult Learning (PAIDEIA) disseminates good practices in terms of quality among training providers. In Slovenia, good practices, tools and recommendations are made available on an online platform (Mozaik Kakovosti) with the goal of providing support for training providers who are developing an internal quality system. In Finland, on top of carrying out evaluations, the Finnish Education Evaluation Centre (FINEEC) is tasked with supporting education and training providers in issues related to evaluation and quality assurance. In this respect, the centre formulates evaluation methods and indicators that education providers can use in self-evaluation and peer reviews. FINEEC also supports the development of an evaluation culture among education and training providers and promotes the spreading of good practices (FINEEC, 2016[6]). In Denmark, a self-evaluation tool (VisKvalitet) is available for training providers to help measure participants’ satisfaction and learning outcomes, as well as the satisfaction of employers whose employees have participated in training programmes. The use of the tool has been made compulsory for continuing vocational education and training providers. The tool gives flexibility to training providers to add questions in addition to the mandatory ones.

4.3.2. Accreditation and quality labels

To guarantee that training providers and programmes comply with minimum quality requirements, many countries have put in place certification mechanisms or quality labels. Both can serve as signals of quality to help individuals, employers and institutions make informed choices about training investments. In some countries, publicly funded training programmes can only be delivered by certified providers as a way to ensure that the quality of training is up to standards.

Institutions in charge of quality control can certify training providers and programmes which have passed a quality evaluation. In Germany, a nationwide certification process for adult learning provision was introduced in 2012. Providers now have to be certified by specific bodies (Fachkundige Stellen, FKS) if they want to carry out employment promotion measures themselves or have them carried out on their behalf. The German Accreditation Body (Deutsche Akkreditierungsstelle, DAkkS) is in charge of accrediting the certification bodies to guarantee their quality. In Japan, from 2018 onwards, training providers who comply with the quality guidelines will be certified. Compliance will be assessed on the basis of documents submitted by the training providers and on-site visits. Training providers in Korea wishing to deliver government-funded training programmes need to be certified. The duration for which certification is granted depends on the outcome of the quality evaluation (see below). In Chile, providers of PES-financed training have to adhere to a quality norm that was set in 2015. Certification based on this quality norm is done by private entities (Organismos certificadores de servicios), which in turn are supervised by a public entity (Instituto Nacional de Normas). When the norm started to be enforced in 2017, this led to the closure of around 800 training providers. In Romania, adult vocational training providers need to be accredited if they want to deliver nationally recognised certificates. The accreditation is based on quality criteria and is carried out by tripartite authorisation commissions (composed of representatives from the Ministry of Labour and Social Justice, the Ministry of National Education, the National Agency for Employment, and the social partners). Providers are accredited for four years and monitored throughout this period.

In a similar vein, but generally on a more voluntary basis, quality labels can be granted to training providers or programmes for signalling reasons. In Austria, the nationwide Ö-Cert quality label was introduced in 2012 to bring transparency to customers and to serve as a quality standard for granting funds. Ö-Cert works as an umbrella label: it recognises existing Quality Management Systems and, in addition, providers have to fulfil the Ö-Cert-basic requirements. The accreditation is done by a group of independent experts. In Switzerland, the responsibility of quality assurance and development lies with the training providers themselves. A range of quality labels are available for training providers to signal their quality. The Slovenian Institute for Adult Education (SIAE) has developed a set of tools to incentivise training providers to implement a culture of quality, including a green quality logo that is granted to providers for continuous and systematic work on quality. The providers must prove that they systematically carry out self-evaluation exercises to be granted the quality logo. In British Columbia (Canada) post-secondary education and training providers can obtain an Education Quality Assurance (EQA) label to show that they meet or exceed quality standards set by the provincial government. These quality standards go beyond what is required by legislation, regulatory bodies and accreditation processes.

4.3.3. Monitoring and evaluating outcomes

Evaluating the quality of training programmes and providers can be a challenging task, as evaluation exercises require information on many different aspects. Effectiveness of training is generally measured by looking at training outcomes, such as labour market entry, or satisfaction with the provided training. These outcomes can be assessed through a variety of monitoring and evaluation methods, implemented either by external quality assurance bodies or internally through self-evaluations of training providers.

Types of quality measures

The assessment of outcomes of training is a common way of measuring the quality of training providers and programmes. Training outcomes are often assessed by looking at the labour market integration of participants. In Lithuania, for example, PES-implemented training programmes are assessed on their effectiveness in terms of short-term and long-term entry of participants into employment. Similarly, the Ministry of Economic Affairs and Employment in Finland measures the effectiveness of training by tracking the labour market status of training participants at different points in time following their participation. In Ireland, outcome assessments are an important part of training evaluation, and a new data system (Programme and Learner Support System) has recently been implemented to enable enhanced tracking of learner outcomes and more informed funding decisions. The system uses the national Further Education and Training course calendar, national course database and learner database to track learners’ lifecycles, including application, interview, start, completion and certification (and early leaving). However, these evaluations of outcomes do not capture the effectiveness of training in improving employment outcomes relative to a control group of similar adults who did not undergo the training (see the discussion below on impact evaluations).

A more subjective way to measure quality is the satisfaction of participants with the provided training, which is generally measured through surveys during and/or after training participation. In the Brussels-Capital Region (Belgium), the results from user satisfaction surveys are part of the quality evaluation done by Bruxelles Formation, the organisation in charge of adult learning for the French-speaking population in Brussels. The organisation aims for an average satisfaction level of at least eight out of ten. In Finland, student surveys are run during and right after every PES-funded training programme, and this information feeds into the evaluation process.

Methods to assess quality and the role of external bodies

The assessment of quality of training providers and programmes can be assigned to external quality bodies that assess quality through inspections. In Norway, the agency for lifelong learning (SkillsNorway) is in charge of the inspections of adult learning provided in study associations and under the publicly-funded training programme for basic working life skills (SkillsPlus). A negative finding from an inspection can result in an order to make changes, but also in withdrawal of public funding and/or an obligation to pay back received public funding. Often these external quality bodies use a wide range of information sources in addition to results from inspection visits to assess the quality of training providers or programmes:

  • The Korean Skills Quality Authority (KSQA) is in charge of the evaluation of vocational training providers, training programmes and trainees. The KSQA conducts an in-depth evaluation of institutions, including on financial soundness, capability to provide training and training performance, and grants certified grades based on the evaluation outcomes. These grades are necessary to provide government-funded training, and better performing institutions receive grades that are valid for longer periods (up to five years). The KSQA also screens training programmes in terms of content, methods, teacher quality, facilities and equipment, and past training outcomes. For the evaluation of the trainees, the KSQA assesses whether the participants who completed training courses have acquired the expected skills. Courses that have positive outcomes in the trainee evaluation can receive additional financial support. The results from the trainee evaluation also feed into the training providers’ evaluation.

  • In England, the Office for Standards in Education, Children’s Services and Skills (Ofsted) grades training providers based on their overall effectiveness, with a focus on: i) the effectiveness of leadership and management; ii) the quality of teaching, learning and assessment; iii) personal development, behaviour and welfare; and iv) outcomes for learners. Inspection judgements are based primarily on first-hand evidence gathered during on-site inspections, but inspectors also consult a range of publicly available data on learners’ and apprentices’ progress and achievement, and have access to a wide range of other information (including self-assessment reports of the providers). The criteria used by inspectors are laid out in the Further Education and Skills Inspection Handbook. Independent training providers who are judged to be inadequate will generally no longer receive funding from the Education and Skills Funding Agency. For Further Education Colleges, a negative review will lead to the development of a Notice to Improve, which sets out the conditions that the college must meet within a time-bound period in order to receive continued funding.

An alternative strategy to monitor and evaluate the performance of training providers is through self-evaluation. In Slovenia, self-evaluation is commonly used among education and training providers. A framework for offering quality education to adults was introduced for adult learning providers in 2001, and this can be used for self-evaluation of entire institutions or specific programmes. The 2018 Adult Education Act states that all adult education providers should have an internal quality system that includes ongoing monitoring and in-depth self-evaluation. Information on how providers conduct their self-evaluations has to be made publicly available. The Brazilian e-Tec training programmes involve all relevant actors in the self-assessment exercises: students, tutors, teachers and coordinators. They evaluate the training programmes, teaching quality and quality of the learning environment. In Portugal, the Qualifica Centres, which provide guidance and support for the recognition of prior learning (RPL), have to submit information on enrolment, referral to education and training pathways and RPL activities to the National Agency for Qualification and Vocational Education (ANQEP), which analyses the information and sends it back to the centres in an effort to encourage self-evaluation.

A more rigorous method to measure the effectiveness of adult learning programmes is the use of impact evaluations. Impact evaluations can be done by a variety of actors, including training providers, public institutions and academic researchers. The main difference between monitoring outcomes and a real impact evaluation is that the latter uses a counterfactual to estimate what part of the observed outcomes can be attributed to the training intervention (White, Sinha and Flanagan, 2006[7]). An impact evaluation of an adult learning programme would therefore generally compare the outcomes of training participants to the outcomes of similar adults who for non-systematic reasons did not participate in the training programme. Outcomes can be measured by a variety of indicators, including employment rates and earnings, depending on the goal of the training programme. As noted by Card, Kluve and Weber (2015[8]), the use of impact evaluations to assess active labour market programs, including training programmes, has increased significantly in recent decades.
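The difference between simply monitoring outcomes and a counterfactual impact evaluation can be illustrated with a stylised simulation on synthetic data. In this sketch (all numbers are illustrative assumptions), an unobserved "ability" trait raises both wages and the probability of self-selecting into training, so a naive comparison of participants and non-participants overstates the effect, while random assignment recovers the true impact:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Unobserved ability raises both wages and the chance of choosing training.
ability = rng.normal(size=n)
trained_voluntary = rng.random(n) < 1 / (1 + np.exp(-ability))  # self-selected
true_effect = 0.05  # assumed true training effect on log wages (5%)
log_wage = (2.0 + 0.10 * ability + true_effect * trained_voluntary
            + rng.normal(scale=0.2, size=n))

# Outcome monitoring: compare mean wages of participants vs. non-participants.
# This mixes the training effect with the ability difference (selection bias).
naive = log_wage[trained_voluntary].mean() - log_wage[~trained_voluntary].mean()

# Impact evaluation with a counterfactual: random assignment breaks the link
# between ability and participation, so the comparison group is truly similar.
assigned = rng.random(n) < 0.5
log_wage_rct = (2.0 + 0.10 * ability + true_effect * assigned
                + rng.normal(scale=0.2, size=n))
experimental = log_wage_rct[assigned].mean() - log_wage_rct[~assigned].mean()

print(f"naive difference:        {naive:.3f}")         # inflated by selection
print(f"experimental difference: {experimental:.3f}")  # close to the true 5%
```

In practice, randomised assignment is often infeasible, which is why the impact evaluation literature also relies on quasi-experimental counterfactuals such as matched comparison groups; the simulation only illustrates why a counterfactual of some kind is needed.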

Some countries have a strong impact evaluation culture, and in a few cases the evaluation of programmes is enshrined in legislation. In Germany, for example, the implementation of the 2003-05 reforms to active and passive labour market policies (often referred to as the Hartz reforms) was explicitly tied to an evaluation mandate. The evidence shows that the re-design of training programmes increased their effectiveness (Jacobi and Kluve, 2006[9]). In Australia, a Try, Test and Learn Fund was set up in 2016 under the Australian Priority Investment Approach to Welfare. This Fund is used for trialling new approaches to moving at-risk income support recipients onto a pathway towards employment, evaluating these approaches using a range of evaluation methods, and learning from the results. Many of the initiatives that are being trialled in the first tranche of the Try, Test and Learn Fund are training programmes for young carers, young parents and students at risk of moving to long-term unemployment, and unemployed former students (Australian Government - Department of Social Services, 2018[10]). The European Social Fund, which funds local, regional and national employment-related projects throughout Europe, made it compulsory in the 2014-20 programming period to assess to what extent the objectives have been achieved (European Commission, 2015[11]). The managers of the projects are free to choose the most suitable method to carry out the impact evaluation, and a practical guidance report on how to design and commission counterfactual impact evaluations was made available by the European Commission (European Commission, 2013[12]). That being said, the use of impact evaluations remains rare in many countries and in specific areas of adult learning. Robust evidence on the effectiveness of training levies, for example, is very uncommon (Müller and Behringer, 2012[13]).

4.3.4. Sharing information on quality

For individuals, employers and institutions to be able to make informed choices about which training to invest in, they need to have access to relevant and up-to-date information on the quality of different training providers and programmes. Certification and quality labels can serve as signals of quality, but training providers can also share more in-depth information on evaluations, learning outcomes and user satisfaction with the general public to help them decide which training to invest in. This information should ideally be easily accessible, presented in a user-friendly format.

In some countries, quality assurance bodies make the results from evaluations publicly available. In Norway, for example, Skills Norway makes the results from inspections of SkillsPlus programmes and adult training in study associations available on its website. In the United Kingdom, the Department for Education publishes summary tables of outcome-based success measures, including sustained employment and learning rates, by provider on its website. In France, certain public institutions that finance training have to review the quality of the training providers they work with, and make the outcomes from the review process publicly available. For training providers that do not hold a specific quality label, the review consists of an evaluation of six quality criteria, including education and training of teachers and sharing of information on training outcomes. Training providers that comply with the criteria are registered in an online database accessible to financers of training (DataDock). In some countries that make use of self-evaluation systems it is compulsory to make the results publicly available. In Brazil, for example, the results from internal evaluations of the e-Tec programmes are published online. In Denmark, the results from self-evaluations through the national VisKvalitet tool are centralised and published online.

Table 4.2. National online databases on adult learning
Availability, coverage and quality information

| Country | Does it exist? | Coverage: training programmes | Coverage: training providers | Quality indication | Name |
|---|---|---|---|---|---|
| Australia | Yes | x | x | x | My Skills |
| Austria | Yes | x | x |  | Weiterbildungsdatenbank |
| Belgium | Yes | x | x |  | VDAB-Vind een opleiding; Dorifor; Formapass |
| Canada | Yes | x | x | x | JobBank; InforouteFPT (Québec); Repères (Québec); EducationPlannerBC (British Columbia) |
| Chile | No |  |  |  |  |
| Czech Republic | No |  |  |  |  |
| Denmark | Yes | x | x |  | UddannelsesGuiden |
| Estonia | Yes | x | x |  | HaridusSilm |
| Finland | Yes | x | x |  | Opintopolku |
| France | .. |  |  |  |  |
| Germany | .. |  |  |  |  |
| Greece | Yes | x | x |  | Ploigos |
| Hungary | Yes | x | x |  | Nemzeti Pályaorientációs Portál |
| Iceland | Yes | x |  |  | Next Step |
| Ireland | Yes | x | x |  | Fetch Courses; Qualifax |
| Israel | .. |  |  |  |  |
| Italy | No |  |  |  |  |
| Japan | Yes | x | x |  | http://course.jeed.or.jp/ |
| Korea | Yes | x | x | x | HRD-Net |
| Latvia | Yes | x | x | x | Webpage of the PES (NVA); NIID |
| Lithuania | Yes | x | x |  | Webpage of the PES (LDB) |
| Luxembourg | Yes | x | x |  | lifelong-learning.lu |
| Mexico | Yes | .. | .. | .. | RENAC |
| Netherlands | .. |  |  |  |  |
| New Zealand | .. |  |  |  |  |
| Norway | Yes | x | x |  | utdanning.no |
| Poland | Yes | x | x | x |  |
| Portugal | Yes | x | x |  | IEFPonline; Qualifica Portal |
| Slovak Republic | No |  |  |  |  |
| Slovenia | Yes | x | x |  | Kam po znanje |
| Spain | No |  |  |  |  |
| Sweden | No |  |  |  |  |
| Switzerland | Yes | x | x |  | orientation.ch |
| Turkey | No |  |  |  |  |
| United Kingdom | No |  |  |  |  |
| United States | Yes | x | x |  | Career One-Stop |
| **Non-OECD countries** |  |  |  |  |  |
| Argentina | Yes | x | x |  | Formate en Red |
| Brazil | No |  |  |  |  |
| Romania | No |  |  |  |  |

Note: ‘Quality indication’ refers to whether or not the database provides information on the quality of specific training programmes or providers (e.g. student satisfaction or labour market outcomes).

Source: OECD Adult Learning Policy Questionnaire.

As discussed in Chapter 2, online databases that provide details on existing training programmes can help individuals, employers and institutions make informed adult learning choices. In some cases, these databases also provide quality information, such as learning outcomes or user satisfaction. The Korean HRD-Net website provides a wealth of information on a wide range of training programmes. In addition to basic information on course duration, costs and the average age of participants, the website provides information on the employment rate and average wages of programme graduates. It also shows participant satisfaction, rated on a scale of zero to five stars, along with participant reviews. Australia’s national directory of vocational education and training providers and courses (www.myskills.gov.au) allows users to search VET qualifications by industry and access information about average course fees, course duration, available subsidies and average employment outcomes. While employment outcomes are currently available by qualification, there are plans to make them available at the provider level as well. Table 4.2 provides an overview of the main available databases on adult learning, including whether or not they provide information on the quality of programmes and providers.6

4.3.5. Fostering skill use at work

For newly developed skills to have an impact on labour market outcomes, they have to be put to good use. Evidence shows that workers who make better use of their skills earn higher wages and have higher job satisfaction (OECD, 2016[3]), and that they reap larger benefits from participation in adult learning (Fialho, Quintini and Vandeweyer, 2019[4]). At the firm level, high skill use is associated with higher productivity. What happens inside the workplace – the way work is organised and jobs are designed, as well as the management practices adopted by the firm – is a key determinant of how skills are used. In particular, it has been argued that better skill use and higher productivity can be achieved by implementing so-called High-Performance Work Practices (HPWP) (OECD, 2016[3]). These practices include aspects of work organisation, such as team work, autonomy, task discretion, mentoring, job rotation and applying new learning, as well as management practices (e.g. employee participation, incentive pay and flexibility in working hours). The use of HPWP is more common among large firms than in SMEs, and high-skilled workers are more likely to be engaged in HPWP than less-skilled workers. The countries that use HPWP most intensively are Denmark, Finland and Sweden, whereas these practices are least common in Greece and Turkey (OECD, 2016[3]).

Many countries have undertaken policy initiatives to promote better skills utilisation through workplace innovation. Most interventions start from the recognition that, if offered expert advice and encouragement to adopt more effective managerial practices, many firms can make better use of existing skills and reap productivity gains, increasing the returns to training for all. Many of these initiatives have focused on raising awareness of the benefits of better skills use, disseminating good practice and sharing expert advice. Employment New Zealand has published a Flexible Work Toolkit to help SMEs understand and manage flexible work with practical tips and tools. Also in New Zealand, Callaghan Innovation (New Zealand’s innovation agency) runs a high-performance working initiative that coaches enterprises to become higher performers through effective employee engagement and improved workplace practices.

Tax incentives and subsidies can be leveraged to incentivise and support firms in adopting HPWP, especially considering that some firms may not have the incentive or financial capacity to promote workplace innovation. The Liideri programme of the Finnish Funding Agency for Technology and Innovation (Tekes) funds projects within companies to renew their operations through developing management principles and forms of working and actively utilising skills and competencies of their personnel. The focus areas of the programme are: i) management principles that help an organisation promote initiative, creativity and innovation potential of personnel; ii) employee-driven innovation; and iii) new ways of working.

A firm’s ability to implement and benefit from HPWP depends largely on the ability of its managers to introduce changes in work practices in a productive way. Low management skills can be a bottleneck to workplace innovation, so policies that seek to promote the development of HPWP may need to be accompanied by management skill development programmes. Employer networks often provide these types of leadership and management skills programmes, in addition to their role as facilitators of knowledge exchange. In some countries, government-supported management training programmes are available to employers, often with a focus on SMEs. In the United Kingdom, for example, eight innovative projects to develop leadership and entrepreneurship skills in SMEs received government support as part of the UK Futures Programme.

In some countries, the adoption of working practices that promote better skills use is facilitated by a strong dialogue between workers and employers, which can itself be fostered by government action. In most of the Nordic countries, but also in Germany and the Netherlands, where the use of flexible working arrangements is high, most workers are covered by collective agreements that stipulate rights to shorter working hours and/or to flexible working. Governments can play an active role in promoting social dialogue on workplace flexibility. For instance, in Germany in 2011, the federal government and social partners signed the “Charter on Family-Oriented Working Hours”, calling on all stakeholders to actively pursue the opportunities of innovative working-hour models in the best interest of the German economy (OECD, 2016[14]).

References

[10] Australian Government - Department of Social Services (2018), Try, Test and Learn Fund |, https://www.dss.gov.au/review-of-australias-welfare-system/australian-priority-investment-approach-to-welfare/try-test-and-learn-fund (accessed on 23 July 2018).

[2] Broek, S. and B. Buiskool (2013), Developing the adult learning sector: Quality in the Adult Learning Sector, Panteia, Zoetermeer.

[8] Card, D., J. Kluve and A. Weber (2015), “What works? A meta analysis of recent active labor market program evaluations”, NBER Working Paper Series, No. 21431, NBER, Cambridge, MA, http://www.nber.org/papers/w21431 (accessed on 23 July 2018).

[11] European Commission (2015), Monitoring and Evaluation of European Cohesion Policy (European Social Fund) - Guidance document, https://www.portugal2020.pt/Portal2020/Media/Default/Docs/AVALIACAO/4-ESF_ME_Guidance_Jun2015.pdf (accessed on 23 July 2018).

[12] European Commission (2013), Design and commissioning of counterfactual impact evaluations - A practical guidance for ESF managing authorities, Publication Office of the European Union, Luxembourg, https://publications.europa.eu/en/publication-detail/-/publication/f879a9c1-4e50-4a7b-954c-9a88d1be369c/language-en (accessed on 23 July 2018).

[4] Fialho, P., G. Quintini and M. Vandeweyer (2019), “Returns to different forms of job-related training: Factoring in informal learning”, OECD Social Employment and Migration working paper Forthcoming.

[9] Jacobi, L. and J. Kluve (2006), “Before and After the Hartz Reforms: The Performance of Active Labour Market Policy in Germany”, IZA Discussion Paper Series, No. 2100, IZA, Bonn, http://ftp.iza.org/dp2100.pdf (accessed on 25 July 2018).

[13] Müller, N. and F. Behringer (2012), “Subsidies and Levies as Policy Instruments to Encourage Employer-Provided Training”, OECD Education Working Papers, No. 80, OECD Publishing, Paris, https://doi.org/10.1787/5k97b083v1vb-en.

[14] OECD (2016), Be Flexible! Background brief on how workplace flexibility can help European employees to balance work and family, https://www.oecd.org/els/family/Be-Flexible-Backgrounder-Workplace-Flexibility.pdf (accessed on 27 November 2018).

[3] OECD (2016), OECD Employment Outlook 2016, OECD Publishing, Paris, https://dx.doi.org/10.1787/empl_outlook-2016-en.

[1] OECD (2005), Promoting Adult Learning, OECD Publishing, Paris, http://www.oecd.org (accessed on 26 July 2018).

[5] République Française (2018), Liste Publique des Organismes de Formation (L.6351-7-1 du Code du Travail), https://www.data.gouv.fr/fr/datasets/liste-publique-des-organismes-de-formation-l-6351-7-1-du-code-du-travail/ (accessed on 19 June 2018).

Notes

← 1. Adults are said to use or expect to use their acquired skills when they report using or expecting to use a lot or a fair amount of the skills acquired in formal or non-formal job-related training. In the Australian data the definition differs slightly, and adults are said to use or expect to use the acquired skills when they report using the skills sometimes, often or always.

← 2. Positive employment outcomes are defined as getting a (new) job, higher salary/wages, promotion in the job, new tasks, or better performance in the present job.

← 3. Correcting the indicators of self-reported impact for personal characteristics, such as education level, age and gender, has a limited impact on the ranking of countries.

← 4. Measuring the returns to training is not a straightforward exercise, as there are many factors that influence an individual’s wage and his or her probability of participating in training. The wage returns included in the dashboard are estimated using a regression that corrects for selection bias (taking into account motivation to learn). See Fialho, Quintini and Vandeweyer (2019[4]) for more details.

← 5. ISO 29990: Learning services for non-formal education and training – Basic requirements for service providers.

← 6. The information provided in Table 4.2 refers to national-level databases only (with the exception of Belgium and Canada, where the responsibility for this lies at the regional level). In some countries, such as Sweden, databases are available at the local level, but this information is not included in the table.
