Chapter 6. Research

This chapter looks at the performance of higher education research and development. It covers the financial and human resources dedicated to research, the distribution of research expenditure, the profile of research personnel, access to research careers, the profile of doctorate holders, research activity, internationalisation, and research productivity and impact.


The statistical data for Israel are supplied by and under the responsibility of the relevant Israeli authorities. The use of such data by the OECD is without prejudice to the status of the Golan Heights, East Jerusalem and Israeli settlements in the West Bank under the terms of international law.

6.1. Introduction

Research and development (R&D) is one of the three key missions of higher education institutions. As defined in the Frascati Manual (OECD, 2015[1]), R&D comprises basic research, which is aimed at creating new knowledge with no specific application in view; applied research, which is aimed at creating new knowledge towards a specific practical aim; and experimental development, which has the goal of developing new products or processes.

Higher education institutions carry out all three forms of R&D. As discussed in Chapter 1, there has been a substantial expansion in research and experimental development activity across the OECD in recent years. The volumes of R&D investment and output are on strong growth trajectories in many countries, notwithstanding the reductions in expenditure that followed the economic crisis in many cases.

However, measuring the return on investment in research and development can be problematic, regardless of whether the return relates to economic or social gain. Indeed, the level of capacity within individual higher education systems to assess and compare the quality and volume of their research output is far from clear.

This chapter looks at how successful higher education systems are in terms of ensuring a strong foundation for investment in R&D expenditure, providing equitable opportunities and attractive working conditions for researchers, and producing high quality research.

6.1.1. Research systems and strategies

A strong framework for systematically creating and diffusing knowledge is a key pillar of any innovation strategy (OECD, 2015[2]). Public research plays a vital role in delivering innovations that have social and economic benefits. Research activities carried out in the public higher education sector, together with the activity of public research institutes (PRIs), make up the public research system. Public research systems are organised differently in the participating jurisdictions (Box 6.1). Overall, three-quarters of total basic research is carried out in the public research system, even though public R&D only accounts for 30% of the overall volume of R&D in the OECD (OECD, 2016[3]).

No consensus has yet emerged on how the quality of research can be measured, how efficient higher education R&D is at driving innovation, and how research infrastructure can be designed and funded most effectively to meet the needs of economies and societies. The traditional role of public research has been to ensure research and development in areas with long-term potential for societal value, even if they may not provide immediate economic gains. Currently, there are increasing expectations on public research systems to transfer knowledge and increase the impact of research (OECD, 2016[3]).

As research and development activity has expanded, OECD governments are increasingly developing specific strategies covering public research and innovation. Each of the participating jurisdictions also has specific plans with measures aiming to improve the performance of research and innovation.

Box 6.1. Public research systems in the participating jurisdictions

As of 2017, the main actors in the research system of Estonia are the six public universities. Of these institutions, Tartu University and Tallinn University of Technology receive the largest share of public funding and have the highest number of students and staff (Kattel and Stamenov, 2017[4]). In addition, there are seven public research organisations and seven private R&D institutions (including one private university) that play an important role in the research system.

In Norway, the public research system includes universities and university colleges, research institutes and hospitals (health trusts). The Research Council of Norway (RCN) funds research over the whole range of R&D activities, and assumes an advisory role to the government in research policy matters. The council also funds the establishment and operation of specially designated research centres which carry out specific functions, such as Centres of Excellence (SFF) in specific fields of science, Centres for Research-based Innovation (SFI), and Centres for Environment-friendly Energy Research (FME).

In the Netherlands, universities carry out the majority of public research, though in recent years there has been some increase in practice-oriented research at professional HEIs. Public research institutes consist of scientific research institutes that are under the Netherlands Organisation for Scientific Research (NWO) and the Royal Netherlands Academy of Arts and Sciences (KNAW); government laboratories; and applied research (TO2) institutes, the latter of which are the most significant of the public research institutes in terms of expenditure (OECD, 2014[5]).

Research in the public system of the Flemish Community is carried out by higher education institutions and four Strategic Research Centres (SRC). There are also a number of additional scientific institutes, knowledge institutes and policy research centres. Each Strategic Research Centre focuses on one key area of research (nanotechnology, biotechnology, automotive and machine production, and multidisciplinary research); the centres are also active in the commercialisation of their research. Belgium also has ten federal scientific establishments, which often conduct research in partnership with universities in the Flemish and French Communities (Flemish Department of Economy, Science and Innovation, 2017[6]).

In Norway, the Long-term Plan for Research and Higher Education 2019–2028 sets the priorities for Norwegian higher education over the period. The government aims to further increase investment in higher education over the period and to facilitate the greater use of knowledge. Key measures of the plan related to R&D are an investment package to improve technology (including increasing basic research in ICT and building an e-infrastructure for open research), boosting the role of R&D in the renewal and restructuring of the business sector (including expanding researcher education in new business creation), and increasing commercialisation, research-based innovation and business-oriented research (Norwegian Ministry of Education and Research, 2018[7]).

The Netherlands has set out a 2025 Vision for Science: Choices for the Future (Dutch Ministry of Education, Culture and Science, 2014[8]), which aims to consolidate the Dutch position as a world leader in research and ensure that the system can evolve to maintain its position amid emerging challenges. Specific commitments include considerable investment in research projects which attract Horizon 2020 funding, and the development of a National Research Agenda (NWA) to set priorities. The policy note Curious and committed: the value of science further elaborated on the 2025 vision, particularly in terms of policy initiatives (Dutch Ministry of Education, Culture and Science, 2019[9]). The Strategic Agenda for Higher Education, Research and Science 2015-2025 (Dutch Ministry of Education, Culture and Science, 2015[10]), also includes objectives to enhance research into higher education practices in order to improve education quality and build strong, permanent links between education, research and practice (for example, through Centres of Expertise to tackle the greatest societal challenges).

In the Flemish Community, the policy note Work, Economy, Science and Innovation 2014-2019 outlines the Flemish commitment to reach the EU 2020 target of investing 3% of gross domestic product (GDP) in research and development, comprising 1% from government funding and 2% from the business sector. There is also an increased focus on the participation of higher education institutions in European programmes, such as the European Research Council and Marie Curie programmes, and on aligning the Flemish research strategy with the European instruments (Flemish Government, 2014[11]).

The Estonian Research and Development and Innovation Strategy 2014-2020 sets goals for the system, including achieving the 3% EU 2020 GDP target, moving to 10th place on the EU Innovation Scoreboard, increasing the number of doctoral graduates and the impact of scientific publications. Estonia is also aiming to increase its share of EU research funding and become more active and visible in international research, development and innovation co-operation initiatives (Estonian Ministry of Education and Research, 2014[12]). Estonia also has particular goals in relation to the levels of investment in R&D by source, by targeting a level of investment of 2% of GDP from the private sector, with 1% of GDP coming from the state and local budget.

6.2. Investment in research and development

The combined expenditure of OECD countries on public R&D currently represents 65% of the global public R&D investment, though the growth of public science systems in emerging economies is likely to change the balance of expenditure in the years to come (OECD, 2016[3]). The higher education sector performs a substantial share of public research activity across OECD countries, and also plays a key role both in performing basic research and training researchers through doctoral education. Expenditure on R&D within higher education has been on a pattern of sustained growth, more than doubling since 1995, though growth has begun to slow in recent years (OECD, 2017[13]).

The policy arguments for investing in R&D are complex. The timelines as well as the economic and social payoffs of research projects are not always clear in advance at the level of individual investments, particularly when it comes to investment in basic research. However, investment in research creates value by improving the body of knowledge and new ideas from which the economy can draw to innovate, create new products and services and improve existing ones. This increased stock of knowledge can provide wider economic or social benefits through knowledge, market or network spill-overs (Georghiou, 2015[14]).

With the goal of promoting innovation high on the policy agenda in many OECD countries, investment in knowledge creation to feed into innovation is increasingly considered crucial. Indicators on the source, destination and distribution of expenditure can provide insight into how much governments are prioritising the R&D sector and which subsectors and types of research are attracting the majority of funding. The comparative data presented in this section focus on the key questions of how higher education expenditure relates to the broader R&D investment in countries, where investment comes from and how it is spent.

6.2.1. Higher education investment within the broader R&D sector

Gross domestic expenditure on research and experimental development (GERD) measures all intramural expenditure on research and development performed within a jurisdiction. It includes R&D performed domestically but funded from abroad, while excluding domestic funds spent on R&D performed in other jurisdictions, and so provides a clear measure of the volume of R&D carried out within any one economy.

GERD is distributed among the four R&D-performing sectors defined by the Frascati Manual (OECD, 2015[1]): business enterprise, government, higher education and private non-profit. GERD therefore encompasses expenditure on higher education R&D (HERD), expenditure on research in the government sector (GOVERD), business enterprise expenditure on R&D (BERD) and expenditure in the private non-profit sector. Government policy and targets in R&D tend to focus on either the R&D sector as a whole or the public research sector, rather than specifically on higher education R&D.
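
To make the relationship between these aggregates concrete, the short sketch below sums the four performing sectors into GERD and expresses each sector's share, as well as GERD as a proportion of GDP. The figures are invented for illustration and are not actual statistics.

```python
# Minimal sketch of the GERD decomposition described above.
# All figures are invented (millions of a national currency), not actual statistics.
expenditure_by_sector = {
    "BERD": 6000,    # business enterprise
    "HERD": 2600,    # higher education
    "GOVERD": 1100,  # government
    "PNP": 300,      # private non-profit
}
gdp = 500000  # hypothetical GDP in the same currency units

gerd = sum(expenditure_by_sector.values())
print(f"GERD: {gerd} ({gerd / gdp:.1%} of GDP)")
for sector, amount in expenditure_by_sector.items():
    print(f"  {sector}: {amount / gerd:.0%} of GERD")
```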

Many countries across the OECD have set targets to increase GERD. For example, in line with the EU 2020 strategy for smart, sustainable and inclusive growth, European countries including Denmark, Germany and France envisage increasing GERD to 3% of GDP by 2020, while Finland, Sweden and Japan have set more ambitious spending targets of 4% of GDP by 2020 (OECD, 2014[15]). However, as can be seen from Figure 6.1, some OECD countries invest considerably more in R&D than others. For example, in Israel and Korea, GERD amounts to more than 4% of GDP, while Turkey, Latvia, Mexico and Chile spend less than 1% of GDP on R&D.

Figure 6.1. Gross domestic expenditure on R&D (2016)
As a percentage of GDP, overall and by performing sector

Note: *Participating in the Benchmarking Higher Education System Performance exercise 2017/2018.

Source: Adapted from OECD (2018[16]), OECD Science, Technology and R&D Statistics, https://doi.org/10.1787/strd-data-en.

 StatLink https://doi.org/10.1787/888933941234

Overall, GERD in the OECD area amounted to 1.9% of GDP in 2016, compared to 1.8% of GDP in 2006. At the level of individual countries, expenditure as a proportion of GDP increased in 23 of the 31 countries with available data for 2006 and 2016, with the most significant increases occurring in Austria (0.7 percentage points of GDP) and Korea (1.4 percentage points). Countries with decreasing investment over the period 2006-2016 include Canada, Finland and Luxembourg (Figure 6.1).

In Flanders, GERD was higher than the OECD average in 2016, at 2.7% of GDP, while in the Netherlands and Norway it stood at approximately 2% of GDP. The Netherlands and Norway have moved steadily from below or at the average level of investment in 2006 to above-average levels by 2016, and while comparable data for 2006 for Flanders are not available, Belgium was already slightly above the OECD average in 2006, with GERD as a proportion of GDP of 1.8%.

GERD patterns have been more volatile in Estonia in recent years, though it must be noted that in relatively small research systems, the ratio between GERD and GDP can be affected by single investments involving relatively large financial amounts. For example, R&D investments related to an Estonian oil shale refinery contributed to GERD reaching 2.3% of GDP in 2011 (from a 2005 level of 1.1%) and progressively decreasing since, reaching a level of 1.3% of GDP in 2016.

Business enterprise expenditure on research and development represents the largest portion of GERD, accounting for over 60% of R&D on average across the OECD (Figure 6.2). HERD is the next largest expenditure category, while GOVERD in OECD countries is lower on average than HERD. Overall, around 26% of GERD in 2016 was allocated to research undertaken by the higher education sector alone.

Figure 6.2. Gross domestic expenditure on R&D by performing sector (2016)

Note: *Participating in the Benchmarking Higher Education System Performance exercise 2017/2018.

Data refer to 2016 or most recently available year.

Source: Adapted from OECD (2018[16]), OECD Science, Technology and R&D Statistics, https://doi.org/10.1787/strd-data-en.

 StatLink https://doi.org/10.1787/888933941253

Figure 6.2 shows that in all OECD countries except Hungary, Korea, Luxembourg, Mexico and Slovenia, the higher education sector was responsible for a larger proportion of R&D expenditure than the government sector in 2016.1 The proportion of expenditure on R&D performed by government was slightly above the OECD average in Flanders (11%), Estonia (11%) and the Netherlands (12%). In Norway, approximately 14% of R&D was undertaken by the government. However, although the government sector is a relatively minor performer in research and experimental development, it represents a major source of funding of R&D undertaken by the higher education and business sectors (OECD, 2015[17]).

In Flanders, the business enterprise sector and the private non-profit sector together represented almost 70% of GERD in 2016. The business enterprise sector accounted for around 50% of GERD in Estonia, the Netherlands and Norway, implying that HERD and GOVERD are relatively more important in these jurisdictions. The higher education sector is particularly important in Estonia, where it was responsible for around 40% of expenditure in 2016.

As Figure 6.2 shows, the higher education sector has been attracting an increasing proportion of GERD in recent years in many countries, even as GERD itself also expands. For example, Portugal increased the proportion of GERD allocated to the higher education sector by more than 10 percentage points between 2006 and 2016. In other countries, however, such as Greece, Hungary and Turkey, the proportion of GERD allocated to higher education has been falling. In the Netherlands and Norway, the proportion of GERD spent in the higher education sector in 2016 was similar to 2006 levels.

6.2.2. Sources of funding for higher education research and development

Higher education draws on various domestic and international funding sources for R&D activities (OECD, 2015[17]). While R&D activities in higher education may be to some extent funded by internal funds (e.g. income from endowments or student fees), the majority of funding comes from outside the higher education sector. Given the pressures on public financing of higher education (see Chapter 1), higher education institutions are increasingly seeking to diversify the sources of funding for R&D, as well as for other higher education activities. This section assesses how well diversified the funding sources for R&D are across OECD higher education systems.

On average across OECD countries with available data, R&D undertaken by higher education in 2016 was financed mainly by the government sector (68%), followed by funding from within the higher education sector itself (12%), funding from abroad (12%), business enterprises (6%) and the private non-profit sector (3%). However, some systems raise a comparatively large share of funding from the business enterprise sector, such as Germany (14% of overall funding) and Korea (13%) (Figure 6.3).

Government funding accounted for more than two-thirds of HERD in Belgium, Estonia and the Netherlands, and close to 90% of HERD in Norway. Funding from abroad is the second largest source of funding of HERD in Estonia (15%), while the business enterprise sector is the second largest source of funding in Belgium (13%). In the Netherlands, 8% of HERD in 2016 was financed from abroad and another 8% came from the business enterprise sector. With 3% of HERD originating from the business sector, Norway had the lowest contribution from business among jurisdictions participating in the benchmarking exercise in 2016.

Compared to other sources of funding for HERD, the contribution of the business sector is relatively small (5% of HERD on average across the OECD in 2016). However, these figures may understate the full extent of businesses’ overall contribution to HERD, which can also involve payments for the use of facilities or outcomes of R&D such as licensing income or investment in spin-offs.

Figure 6.3. Expenditure on research undertaken by the higher education sector, by source of funding (2016)
As a percentage of total funds

Note: *Participating in the Benchmarking Higher Education System Performance exercise 2017/2018.

Data refer to 2016 or most recently available year.

Source: Adapted from OECD (2018[16]), OECD Science, Technology and R&D Statistics, https://doi.org/10.1787/strd-data-en.

 StatLink https://doi.org/10.1787/888933941272

In addition to contributions from businesses, funding from private non-profit organisations is an important indicator of engagement in R&D performed by the higher education sector. In some countries, such as Denmark, Sweden and the United Kingdom, the contribution of private non-profit organisations to HERD far exceeds that of the business sector. However, in the four participating jurisdictions, private non-profit funding in higher education is not a substantial source of funding; while it was the source of more than 6% of funding in the Netherlands in 2016, it made up less than 4% of funding in Norway and less than 1% in Belgium and Estonia.

When compared to the OECD average, the higher education sectors in Belgium, Estonia, the Netherlands and Norway contribute less funding to support R&D undertaken by higher education. This may be related to relatively low availability of internal funds (e.g. income from endowments or student fees) within the higher education sectors of the participating jurisdictions, compared to some other OECD countries.

Differences in funding from different sources can be related to the funding mechanisms in place for research in particular country contexts; while some systems may fund R&D from general institutional funds, in other cases institutions may receive a specific allocation of R&D funding from government. Differences are also related to the relative availability of funding from different sources. For example, European countries are eligible to apply for targeted R&D funding from the European Union, so they may have more capacity to attract funding from abroad. In other countries, notably Canada, Sweden, the United Kingdom and the United States, the private non-profit sector is an important source of funds.

Table 6.1 summarises the key funding mechanisms for each of the four participating jurisdictions. As can be seen from the table, performance-based formula funding and competitive funding mechanisms for R&D, as well as block grant funding, are in place in all jurisdictions. For example, in the Flemish Community, in addition to the block grant funding for research provided by the Department of Education and Training, higher education institutions can receive special research funding from the Department of Economy, Science and Innovation, which is provided based on performance (Jonkers and Zacharewicz, 2016[18]). These “Special Research Funds” (BOF) are awarded based on the number of master’s and doctoral degrees awarded, gender diversity, and research productivity and impact. Institutions can also benefit from “Industrial Research Funds” (IOF) if they engage in technology transfer activities such as licensing, patenting and spin-offs.
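
The precise BOF and IOF allocation keys are laid down in Flemish regulation. Purely to illustrate the general logic of a performance-based formula of this kind, the hypothetical sketch below distributes a notional budget across two invented institutions in proportion to weighted indicator shares; the weights, indicators and institutional data are all assumptions made for the example.

```python
# Hypothetical sketch of a performance-based allocation formula of the kind
# described above. Weights, indicators and institutional data are invented;
# the actual allocation keys are set in regulation.
BUDGET = 100_000_000  # notional budget to distribute (EUR)

WEIGHTS = {"degrees": 0.30, "doctorates": 0.30, "publications": 0.25, "citations": 0.15}

institutions = {
    "University A": {"degrees": 4000, "doctorates": 300, "publications": 2500, "citations": 40000},
    "University B": {"degrees": 2500, "doctorates": 150, "publications": 1200, "citations": 15000},
}

# Each institution's score is its share of the system total on each indicator,
# weighted and summed; the budget is then distributed in proportion to the scores.
totals = {k: sum(inst[k] for inst in institutions.values()) for k in WEIGHTS}
scores = {name: sum(WEIGHTS[k] * inst[k] / totals[k] for k in WEIGHTS)
          for name, inst in institutions.items()}
total_score = sum(scores.values())

for name, score in scores.items():
    print(f"{name}: EUR {BUDGET * score / total_score:,.0f}")
```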

The Netherlands directs a special stream of funding towards practice-oriented research as part of the funding allocated to the professional HEI sector. This stream of funding can be used to appoint associate professors (lectors) who specialise in developing research projects of mutual interest in conjunction with stakeholders. In addition, competitive funding is available for professional HEIs to establish Centres of Expertise, public-private partnerships set up to encourage collaboration between higher education institutions, industry and government. Most of the Centres of Expertise are affiliated with one of the “top sectors”, key sectors of importance to the Dutch economy (Section 6.7).

Funding from international sources

A number of countries rely heavily on funding from abroad to finance higher education R&D, including from international organisations and supranational entities. In five of the countries for which data were available for 2016, funding from international sources represented over one-fifth of total funding, ranging from 23% of funding in Poland to over 56% of funding in the Slovak Republic (Figure 6.3). However, for EU countries, some of the differences between countries can also be related to how funding from European Structural Funds is accounted for in budgets. In some countries, it may be classified directly as funding from abroad, while in others it may be incorporated into national funds before being allocated, meaning it is then classified as government funds.

Table 6.1. Types of funding for R&D in the participating jurisdictions

Base funding

  • Estonia: Yes (provided by the Ministry of Education and Research to R&D institutions that received a positive evaluation).
  • The Flemish Community: Yes (provided by the Department of Education and Training).
  • The Netherlands (universities): Yes (part of the block grant; fixed allocations constitute 58% and another 5% is allocated for doctoral training).
  • The Netherlands (professional HEIs): Yes (to support practice-oriented research, provided as part of the lump sum funding for professional HEIs).
  • Norway: Yes (constitutes 70% of the block grant, without detailed specifications of its use).

Performance-based funding

  • Estonia: Yes (base funding is performance-based).
  • The Flemish Community: Yes (provided by the Department of Economy, Science and Innovation through Special Research Funds and Industrial Research Funds).
  • The Netherlands (universities): Yes (part of the block grant is formula-based with performance elements and constitutes 37% of the block grant).
  • The Netherlands (professional HEIs): Research-related indicators are also included in the performance agreements.
  • Norway: Yes (constitutes 6% of the block grant for HEIs, provided based on performance).

Project- and/or programme-based competitive funding/research grants

  • Estonia: Research grants for research groups, institutions or individuals.
  • The Flemish Community: Yes (project-based funding provided by the Research Foundation).
  • The Netherlands (universities): Yes (competitive project- and programme-based funding provided by the Research Council and the Royal Academy of Sciences).
  • The Netherlands (professional HEIs): Yes (NWO competitive funds for practice-oriented research; supports knowledge exchange between SMEs and professional HEIs and the creation of Centres of Expertise).
  • Norway: Yes (competitive project-based funding, primarily provided by the Research Council of Norway).

Funding to support research infrastructure

  • Estonia: Yes.
  • The Flemish Community: Yes (through the programme infrastructure of the Research Foundation).
  • The Netherlands: Yes (in support of the “top sectors” activities).
  • Norway: Yes (with the aim of increasing appropriations to research infrastructure by NOK 400 million by 2018).

Indicators or other considerations attached to funding mechanisms

  • Estonia: To be eligible for baseline funding, R&D institutions must have a positive evaluation in the regular government research evaluation process. In total, 95% of funding is awarded based on performance criteria (high-level research publications, patents and patent applications, co-financing of R&D, and doctoral graduates); 5% is allocated to humanitarian research of national significance.
  • The Flemish Community: Special Research Funds are awarded based on the number of master's degrees, defended doctorates, gender diversity, publications and citations. Industrial Research Funds are awarded based on defended doctorates, publications and citations, revenues from licences, revenues from EU contracts, patents and spin-off companies.
  • The Netherlands: Formula-based funding (37% of the core R&D funding of universities) considers degrees and defended doctoral degrees. Indicators in performance agreements include research contracts funded by research councils and the EU, scientific impact, scores in research assessment exercises and doctorate degrees awarded. Competitive funding supports co-operation between professional HEIs and business.
  • Norway: Performance-based funding is awarded based on several indicators, including scientific production, student credits, degrees, exchange students, competitive funding from the research council and regional research funds, and funding from the EU and other third parties.

Source: Adapted from Jonkers and Zacharewicz (2016[18]), Research Performance Based Funding Systems: a Comparative Assessment, https://doi.org/10.2760/70120; information provided by the participating jurisdictions. See the reader's guide for further information.

Figure 6.4. European Commission funding of government and higher education R&D in selected European countries (2015)
Share of government and higher education R&D funded by EC as a percentage

Note: *Participating in the Benchmarking Higher Education System Performance exercise 2017/2018.

For Austria, Belgium, Denmark, and Sweden, data refer to 2013. For Germany, France, Italy, Lithuania, Luxembourg, the Netherlands, Poland, Portugal and Slovenia, data refer to 2014.

Source: Adapted from OECD (2017[13]), OECD Science, Technology and Industry Scoreboard 2017: The digital transformation, https://doi.org/10.1787/9789264268821-en.

 StatLink https://doi.org/10.1787/888933941291

As can be seen in Figure 6.4, funding from international sources represents a small proportion of funding overall across OECD countries, although it tends to be more substantial for countries that are eligible to receive funding from the European Union (EU). Funds provided by the EU are especially important for R&D undertaken in a small group of European countries, reaching almost half the funding in the Slovak Republic in 2015. EC funding is also important for Estonia (15% in 2017), while it accounts for 7% of Belgian funding. On the other hand, Norway and the Netherlands have some of the lowest shares of their overall higher education R&D funding coming from the European Commission, at around 2% (Figure 6.4).

In recent years, countries have had varying rates of success in attracting R&D funding from EC sources (Table 6.2). Over the period 2014-2016, Belgium was the most successful of all European Union countries in attracting funds from the Horizon 2020 framework programme for R&D, with an 18% success rate from almost 15 000 applications. The Netherlands and Norway also had relatively high success rates for their applications, at 17% and 16% of applications respectively. In Estonia, where approximately 2 000 applications were submitted for funding over the period, the success rate was 13%.

Table 6.2. Success rates in attracting Horizon 2020 funding (2014-2016)

Country: number of applications | % of overall applications 2014-2016 | application success rate (%)

Belgium: 14 840 | 3.7 | 18
Austria: 9 705 | 2.4 | 17
France: 30 660 | 7.7 | 17
Luxembourg: 1 095 | 0.3 | 17
The Netherlands: 22 226 | 5.6 | 17
Germany: 44 811 | 11.2 | 16
Sweden: 11 464 | 2.9 | 16
Norway: 5 847 | 1.5 | 16
Denmark: 8 981 | 2.2 | 15
Ireland: 6 394 | 1.6 | 15
United Kingdom: 49 412 | 12.4 | 15
The Czech Republic: 4 385 | 1.1 | 14
Spain: 42 403 | 10.6 | 14
Finland: 8 671 | 2.2 | 14
Estonia: 2 020 | 0.5 | 13
Greece: 12 839 | 3.2 | 13
Portugal: 9 521 | 2.4 | 13
The Slovak Republic: 1 901 | 0.5 | 13
Italy: 44 820 | 11.2 | 12
Lithuania: 1 095 | 0.3 | 12
Latvia: 1 419 | 0.4 | 12
Poland: 7 901 | 2.0 | 12
Hungary: 4 874 | 1.2 | 11
Slovenia: 4 512 | 1.1 | 11

Note: *Participating in the Benchmarking Higher Education System Performance exercise 2017/2018.

Source: Adapted from European Commission (2018[19]), Horizon 2020 in full swing - Three Years On - Key facts and figures 2014-2016, https://doi.org/10.2777/778848.
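
As a simple illustration of what these rates imply, the sketch below derives the approximate number of successful applications for the participating jurisdictions from the figures in Table 6.2. The derived counts are indicative only, since the published rates are rounded.

```python
# Approximate number of successful Horizon 2020 applications implied by the
# rounded success rates reported in Table 6.2 (2014-2016).
h2020 = {
    # country: (applications, success rate in %)
    "Belgium": (14_840, 18),
    "The Netherlands": (22_226, 17),
    "Norway": (5_847, 16),
    "Estonia": (2_020, 13),
}

for country, (applications, rate) in h2020.items():
    print(f"{country}: ~{applications * rate / 100:,.0f} successful applications")
```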

6.2.3. How research and development funding is spent

Current and capital costs

Current expenditures in R&D are composed of labour costs of R&D personnel; other current costs used in R&D, such as services and items (including equipment) used and consumed within one year; and annual fees for the use of fixed assets. Capital costs cover the purchase of fixed assets such as land and buildings, machinery and equipment, capitalised computer software and other intellectual property products that are used in R&D for more than a year (OECD, 2015[1]). This increasingly includes electronic infrastructure such as data, computing and communications networks that are used within R&D systems or, in some fields of research, shared between systems (European Strategy Forum on Research Infrastructures Long-Term Sustainability Working Group, 2017[20]).

On average across the OECD in 2015, current costs represented 89% of GERD, and capital costs just 11%, though in many countries the proportion of expenditure dedicated to current costs is above 90%. Research is human-resource intensive, and labour costs are therefore generally the largest component of current costs (OECD, 2015[1]).

Figure 6.5. Expenditure on R&D by type of cost (2015)

Note: *Participating in the Benchmarking Higher Education System Performance exercise 2017/2018.

Data refer to 2015 or most recently available year.

Source: Adapted from OECD (2018[16]), OECD Science, Technology and R&D Statistics, https://doi.org/10.1787/strd-data-en.

 StatLink https://doi.org/10.1787/888933941310

On average, the breakdown of HERD by type of cost does not differ greatly from the breakdown of GERD; overall, around 12% of costs relate to capital expenditure (Figure 6.5). However, capital costs can vary over time within countries according to national plans for building or improving physical infrastructure. For example, in Latvia and Poland, capital expenditure represented more than 30% of HERD in 2015, which may indicate that these countries were investing in expanding their research infrastructure.

In the participating jurisdictions, varying levels of capital expenditure were evident in 2015. Estonia spent 15% of GERD and 17% of HERD on capital costs, significantly higher than the OECD average, which could reflect additional investments under the Estonian Research Infrastructures Roadmap (see below). Belgium, the Netherlands and Norway spent below the OECD average proportion on capital expenditure in 2015, amounting to approximately 8% in each of these jurisdictions. However, in general, capital expenditure in higher education tends to show some volatility over time, depending on the levels of investment in infrastructure required and priorities for expenditure.

Improving physical research infrastructure is a top priority for science technology and innovation policymakers in most OECD countries (OECD, 2017[13]). For example, in 2019, Estonia updated its Research Infrastructure Roadmap to improve existing infrastructure and create new facilities and equipment. The roadmap earmarks 17 research infrastructure projects of national importance for investment in the coming decade. Estonia is also involved in the development of 14 international research infrastructures. Norway also committed to increasing appropriations to research infrastructure by NOK 400 million over the period 2015-2018, and has a national roadmap for research infrastructure, which is updated biannually (Norwegian Ministry of Education and Research, 2018[7]).

Expenditure by type of R&D

Overall, applied research and experimental development account for approximately 75% of gross domestic expenditure on R&D on average in the OECD area, and for more than 80% in eleven countries, including Israel, Japan and Korea (Figure 6.6). In the higher education sector, by contrast, approximately 53% of R&D expenditure (HERD) was allocated to basic research on average across OECD countries with available data for 2015, followed by applied research (35%) and experimental development (10%), with marked differences across countries (Figure 6.6). This highlights the key role that higher education plays in conducting basic research across OECD countries.

The proportion of GERD allocated to basic research in 2015 was relatively low in Belgium (16%), while it was just above average in the Netherlands and Estonia, at around 27% in both jurisdictions. In Norway, the breakdown of GERD was 17% for basic research, 36% for applied research and 40% for experimental development. Within the higher education sector, basic research accounts for more than 70% of HERD in France, Luxembourg and Switzerland, while other countries, such as the United Kingdom and Korea, invest considerably more in applied research and experimental development. The Netherlands and Estonia spend a slightly higher than average proportion of HERD on basic research (approximately 57%), whereas the proportion of HERD devoted to basic research in Belgium was the lowest among OECD countries in 2015, making up less than 20% of spending (Figure 6.6).

Figure 6.6. Expenditure on R&D by type of R&D activity (2015)

Note: *Participating in the Benchmarking Higher Education System Performance exercise 2017/2018.

Data refer to 2015 or the latest available year.

Source: Adapted from OECD (2018[16]), OECD Science, Technology and R&D Statistics, https://doi.org/10.1787/strd-data-en.

 StatLink https://doi.org/10.1787/888933941329

Many countries have set targets to increase expenditure on applied research in recent years, including the participating jurisdictions. In line with the target to increase investment in R&D to 3% of GDP by 2020, the Flemish Community aims to increase funding for fundamental, basic and applied research at higher education institutions. For 2019, the Flemish Government budgeted an increase of EUR 128 million for R&D. In 2015, Estonia established a new instrument to support the development of applied research in areas of smart specialisation. Approximately EUR 27 million will be allocated to support the development of business R&D and co-operation between higher education institutions and business (Kattel and Stamenov, 2017[4]).

However, it is important to ensure that the growth in applied research does not come at the expense of basic research, and that an appropriate balance of basic and applied research is maintained (OECD, 2008[21]). With the shift in emphasis in public research away from public research institutes and towards universities (OECD, 2016[3]), the higher education sector will continue to play the core role in ensuring that fields of knowledge that may hold social and cultural value, though not necessarily immediate economic value, are protected. At the same time, research universities face an increasing pressure to commercialise knowledge and earn income from sources other than public funds, which creates conflict with the traditional view that knowledge production and dissemination is a public good, and threatens to erode the position of basic research (Altbach, Reisberg and Rumbley, 2009[22]).

6.3. Profile of research and development personnel

Research and experimental development activities rely on the availability of high-quality R&D personnel, covering everyone employed directly in R&D activities, including researchers, technicians and other support staff (OECD, 2015[17]). Methods of calculating the numbers of full-time equivalent research staff differ across countries, as countries do not always have the information needed to distinguish research from other functions as set out in the Frascati Manual, or because coverage may differ (for example, some, but not all, countries include doctoral students as researchers) (OECD, 2017[13]).

6.3.1. Researcher numbers relative to the labour force

Researchers are “professionals engaged in the conception or creation of new knowledge. They conduct research and improve or develop concepts, theories, models, techniques, instrumentation, software or operational methods” (OECD, 2015[1]). One way of comparing the supply of researchers to R&D systems is to measure the number of researchers relative to the size of the labour force. Across all research sectors, the highest numbers of full-time equivalent (FTE) researchers per one thousand people in the labour force in 2016 were found in the Nordic countries, Japan and Korea (Figure 6.7).

Among the participating jurisdictions, the number of FTE researchers per one thousand people in the labour force in 2016 was slightly above the OECD average in Flanders (8) and the Netherlands (9). Norway had one of the highest concentrations of FTE researchers in the same year, with 12 per one thousand people in the labour force. On the other hand, Estonia had 6 researchers per one thousand people in the labour force, below the OECD average.
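
As a simple illustration of how this indicator is constructed, the sketch below computes FTE researchers per 1 000 people in the labour force; the researcher and labour force figures are invented for the example.

```python
# Minimal sketch of the indicator shown in Figure 6.7: full-time equivalent (FTE)
# researchers per 1 000 people in the labour force. Figures are invented.
fte_researchers = 45_000    # FTE researchers across all sectors
labour_force = 5_400_000    # people in the labour force

per_thousand = fte_researchers / labour_force * 1_000
print(f"{per_thousand:.1f} FTE researchers per 1 000 people in the labour force")
```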

Figure 6.7. Researchers in the labour force (2016)
Full-time equivalent researchers per 1 000 people in the labour force

Note: *Participating in the Benchmarking Higher Education System Performance exercise 2017/2018.

Data refer to 2016 or most recently available year.

Source: Adapted from OECD (2018[16]), OECD Science, Technology and R&D Statistics, https://doi.org/10.1787/strd-data-en.

 StatLink https://doi.org/10.1787/888933941348

The low share of researchers in the Estonian workforce may be partly explained by an ageing population and outward migration, but also by a lack of funding and incentives to pursue a research career. A previous study also found that salaries for researchers were lower than the EU average (Kattel and Stamenov, 2017[4]). Moreover, the reliance on short-term, project-based funding may lead to precarious conditions for researchers. To address these challenges, Estonia is making use of European structural funds to develop research capacity (Kattel and Stamenov, 2017[4]). In addition, the government has been working to make funding for R&D more sustainable by increasing the share of recurrent funding to institutions so that the proportion of such funding to competitive research grants would be 50:50 (Jonkers and Zacharewicz, 2016[18]).

Well-designed human resources policies can play an important role in attracting talented human capital to the research profession. Adopting internationally agreed human resource principles into local policies can also act as an important signal to potential talent. For example, in the Flemish Community, almost all universities and other R&D institutions have obtained a ‘Human Resources Excellence in Research’ designation, or are close to obtaining this recognition. This designation indicates that the human resources policy for researchers in this jurisdiction is in line with the human resources strategy and principles of the European Charter and Code for Researchers (see Chapter 4).

6.3.2. Researchers by sector of employment

On average, around one-half of FTE researchers in OECD countries work in the higher education and government sectors, with 40% of all researchers working in higher education in 2016, and 11% in the government sector, though there are marked differences between countries (Figure 6.8). Higher education and government researchers combined account for less than 20% of total FTE researchers in Korea; while in Greece, Latvia and the Slovak Republic, higher education and government researchers combined represent at least 80% of the overall numbers.

Figure 6.8. Researchers by sector of employment (2016)
Full-time equivalent researchers as a percentage of national totals

Note: *Participating in the Benchmarking Higher Education System Performance exercise 2017/2018.

Data refer to 2016 or most recently available year.

Source: Adapted from OECD (2018[16]), OECD Science, Technology and R&D Statistics, https://doi.org/10.1787/strd-data-en.

 StatLink https://doi.org/10.1787/888933941367

Higher education researchers make up over half of all FTE researchers in Estonia, while the proportions are lower in the other participating jurisdictions; around 37% in Flanders and Norway and 28% in the Netherlands. The proportion of government researchers is also lower than average in Flanders at around 8%, while they are closer to the average (around 12%) in Estonia and the Netherlands, and make up 15% of researchers working in Norway.

Between 2005 and 2015, the share of researchers in higher education increased in Belgium and Estonia and remained unchanged in Norway. The Netherlands experienced a decrease in the proportion of higher education researchers by around 8 percentage points over the same time period. The smaller share of higher education researchers in the Netherlands may partly be explained by the presence of public research institutes, including applied research (TO2) institutes (Box 6.1).

6.3.3. Gender equality in the research and development workforce

Women now outnumber men in terms of enrolment at the bachelor's and master's levels, on average across the OECD, and gender parity in enrolment in doctoral education has almost been achieved, as women now make up 48% of new entrants to doctoral education overall (Section 6.4). However, some countries are lagging behind on gender equity in the research and development workforce, and women remain under-represented in doctoral education in some fields of research, including engineering and science (OECD, 2015[17]). Other forms of gender inequality persist that are specific to the research and development sector; for example, in higher education, women are less likely to hold a senior academic position, be corresponding authors on research publications or manage a higher education institution (OECD, 2015[17]).

On average in OECD countries with available data, women account for around 40% of full-time equivalent researchers in the government, higher education and private non-profit sectors. While this shows that gender parity has not yet been achieved in higher education, progress is more advanced than in the business enterprise sector, where only around 23% of researchers were women in 2016. In Iceland, Latvia, Lithuania and Portugal, parity of male and female researchers in higher education has been achieved, while in the government sectors of Estonia, Poland, Portugal and Latvia there is now a larger proportion of female than male researchers. In Japan and Korea, although female representation is higher in higher education than in other R&D sectors, fewer than 30% of higher education researchers were women in 2016 (Figure 6.9).

Figure 6.9. Women researchers, overall and by sector of employment (2016)
As a percentage of total full-time equivalent researchers

Note: *Participating in the Benchmarking Higher Education System Performance exercise 2017/2018.

Data refer to 2016 or most recently available year.

Source: Adapted from OECD (2018[16]), OECD Science, Technology and R&D Statistics, https://doi.org/10.1787/strd-data-en.

 StatLink https://doi.org/10.1787/888933941386

Norway was approaching gender parity among higher education researchers in 2016, with women making up 47% of the total. Estonia and Flanders were also above the OECD average on this measure, with women accounting for 44% of higher education researchers, while the Netherlands was slightly below the average at 40% (Figure 6.9).

As discussed in Chapter 4, many countries have introduced policies aimed at increasing the participation of women in research careers. While there have undoubtedly been some advances in terms of increased participation, persistent challenges remain to be overcome before gender equity in research and development can become a reality (Box 6.2).

Box 6.2. Persistent barriers to gender equity related to research and development

A recent OECD and G20 review of the evidence base covering the position of women in the modern digital economy and society found that large inequalities still exist between men and women across many areas relevant to research and innovation. Findings include:

  • There is a systematic under-representation of women in ICT jobs and in top management positions in business and academia. For example, only 17% of scientists earning a salary of more than USD 105 000 are women.

  • Women still account for only one-fifth of graduates in STEM subjects, and only make up 20% of corresponding authors on STEM publications.

  • Around 90% of innovative start-ups seeking venture capital funding are run by men. When women-owned start-ups do seek funding, they receive on average 23% less funding. Evidence indicates that this gap can be narrowed when women are included in the management structure of venture capital firms.

  • While progress has been made in the number of patents filed by teams with at least one woman, overall 80% of patents filed at key intellectual property offices worldwide are filed by all-male teams.

Source: Borgonovi et al. (2018[23]), Empowering Women in the Digital Age; Where Do We Stand?, https://www.oecd.org/social/empowering-women-in-the-digital-age-brochure.pdf.

6.3.4. Researchers in higher education by field of science

Researchers in OECD countries work across a broad range of fields of science, though many countries tend to specialise more heavily in particular fields. Broad fields of science in this section are defined according to the ISCED 2011 classification (OECD/Eurostat/UNESCO Institute for Statistics, 2015[24]), though at a more granular level, new fields are constantly emerging as communities of researchers grow, new technologies develop and science becomes more specialised.

According to 2016 data, around one-quarter of higher education researchers across OECD countries with available data work in natural sciences (24%), while just over 20% work in engineering and technology and another 20% in social sciences. Medical and health sciences account for 18% of researchers, 12% work in humanities and the arts, and just over 4% of researchers across the OECD area are in agricultural and veterinary sciences (Figure 6.10).

While a variety of patterns can be observed across countries, at least 50% of researchers in each country are working in STEM-related fields of natural sciences, engineering and technology, medical and health sciences, and agricultural and veterinary sciences. Estonia has the largest share of higher education researchers in natural sciences among OECD countries with available data, making up 39% of researchers, while on the other end of the scale, less than 10% of researchers in Turkey are working in areas related to natural sciences.

In Belgium, the distribution of higher education researchers across fields of science is similar to that of the OECD average. In the Netherlands and Norway, there is a particularly high proportion (more than 30%) of higher education researchers working in medical and health sciences.

Figure 6.10. Researchers in higher education by field of science (2016)

Note: *Participating in the Benchmarking Higher Education System Performance exercise 2017/2018.

Data refer to 2016 or most recently available year.

Source: Adapted from OECD (2018[16]), OECD Science, Technology and R&D Statistics, https://doi.org/10.1787/strd-data-en.

 StatLink https://doi.org/10.1787/888933941405

Differences between concentrations of researchers in different fields of science can relate to government policy goals or country specialisation in different sectors. In many countries, including the participating jurisdictions, governments have identified “key sectors” in which to focus R&D activity (Section 6.7).

Differences can also relate to the ways in which public research is distributed between the higher education sector and the government sector. As among higher education researchers, engineering and technology, medical and health sciences, and natural sciences are the three most represented fields among government researchers in OECD countries with available data, with natural sciences accounting for the largest share of the three. However, compared to the higher education sector, a smaller proportion of government researchers across the OECD are in the social sciences (11%), while a higher proportion (13%) are in agricultural and veterinary sciences, although differences between countries are substantial. In Ireland, for example, more than half of government researchers are in agricultural and veterinary sciences, while in Norway one-quarter of government researchers are in the social sciences (Figure 6.11).

Figure 6.11. Researchers in the government sector by field of science (2016)

Note: *Participating in the Benchmarking Higher Education System Performance exercise 2017/2018.

Data refer to 2016 or most recently available year.

Source: Adapted from OECD (2018[16]), OECD Science, Technology and R&D Statistics, https://doi.org/10.1787/strd-data-en.

 StatLink https://doi.org/10.1787/888933941424

Estonia has one of the largest shares of government researchers in the humanities and the arts (38%). While engineering and technology is the second most represented field of science among higher education researchers in Estonia, it is the least represented field among government researchers. This reflects a historical division of roles between different sectors; following Estonian independence, many government research institutions merged with universities, whereas those institutions that carry out other functions in addition to research and development activities (e.g. the Estonian Literary Museum and the Institute of the Estonian Language) have tended to remain in the government sector.

In Belgium, on the other hand, the largest share of government researchers (40%) is in engineering and technology, a difference of almost 25 percentage points from the OECD average. And while social sciences is one of the fields that is least represented among government researchers in general across the OECD, it attracts the largest share of government researchers in Norway (25%).

6.3.5. Technicians and support staff

In addition to staff with research and field-specific expertise, other categories of skilled personnel are also required to support research activity, including personnel with ICT skills, personnel with administrative skills, and those who can operate and maintain physical machinery related to research activities.

In the R&D sector, technicians and equivalent staff are defined as “persons whose main tasks require technical knowledge and experience in one or more fields of engineering, the physical and life sciences, or the social sciences, humanities and the arts. They participate in R&D by performing scientific and technical tasks involving the application of concepts, operational methods and the use of research equipment, normally under the supervision of researchers” (OECD, 2015, p. 163[1]).

The evidence presented in this section indicates the variety of human resource patterns in R&D across the OECD. The relative proportions of technicians and other support staff can depend on different methods of apportioning research-related tasks in different countries, or differences in the amount of applied research and experimental development carried out, which may require greater numbers of certain staff categories. Differences in the relative concentration of technicians and other support staff therefore reflect very different ways in which research is organised, as well as the variety of roles and responsibilities undertaken by staff working in research and development in different countries.

In the OECD countries with available data for 2016, there are on average 33 technicians for every 100 researchers. The ratio of technicians to researchers tends to be higher than average in the government sector (39 technicians per 100 researchers) and lower than average in the higher education sector (19 technicians per 100 researchers). Across countries, the number of technicians per 100 researchers in higher education ranges from fewer than 5 in the Slovak Republic and Ireland to as high as 69 in Chile (Figure 6.12).
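
As an illustration of how these ratios are derived, the following sketch computes FTE technicians per 100 researchers overall and by sector; the staff numbers are invented for the example.

```python
# Minimal sketch of the ratios shown in Figure 6.12: FTE technicians per
# 100 researchers, overall and by sector. Staff numbers are invented.
staff = {
    # sector: (FTE researchers, FTE technicians)
    "business enterprise": (30_000, 12_000),
    "government": (5_000, 2_000),
    "higher education": (20_000, 3_500),
}

total_researchers = sum(r for r, _ in staff.values())
total_technicians = sum(t for _, t in staff.values())
print(f"Overall: {total_technicians / total_researchers * 100:.0f} technicians per 100 researchers")
for sector, (researchers, technicians) in staff.items():
    print(f"  {sector}: {technicians / researchers * 100:.0f} per 100 researchers")
```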

The lower ratio of technicians in higher education, compared with other sectors, is not unexpected given that higher education performs a relatively high proportion of basic research in most countries. Applied research and experimental development are likely to require a higher ratio of technicians to researchers to perform the necessary tasks. However, with many higher education systems aiming to expand the volume of applied research, and with an increasing use of physical infrastructure even for basic research (Section 6.2.3), the demand for research technicians and other associated staff in higher education is likely to increase in the future.

In Estonia, there was an overall ratio of 22 technicians to 100 researchers in 2016, though the ratio is higher in the government sector (45 per 100 researchers) and much lower in the higher education sector (13 per 100 researchers). The difference between the government and higher education sector was even higher in the Netherlands, with 42 technicians per 100 researchers in the government sector, and around 10 in the higher education sector, partly due to the presence of public research institutes (Box 6.1). Belgium also has a similar pattern to the Netherlands, with 46 technicians per 100 researchers in the government sector, and 15 in higher education, though their most recently available data refer to 2011.

Figure 6.12. Technicians to researchers (2016)
FTE technicians per 100 researchers, overall and by sector of employment

Note: *Participating in the Benchmarking Higher Education System Performance exercise 2017/2018.

Data refer to 2016 or most recently available year.

Source: Adapted from OECD (2018[16]), OECD Science, Technology and R&D Statistics, https://doi.org/10.1787/strd-data-en.

 StatLink https://doi.org/10.1787/888933941443

Other support staff include “skilled and unskilled craftsmen, and administrative, secretarial and clerical staff participating in R&D projects or directly associated with such projects” (OECD, 2015, p. 164[1]). According to 2016 data, in OECD countries with available data there were on average 17 other support staff per 100 researchers. As is the case with research technicians, this ratio is higher in the government sector (33 per 100 researchers) and slightly lower in the higher education sector (14 per 100 researchers), with marked differences between countries (Figure 6.13).

The ratio of other support staff to 100 researchers in higher education is more than 40 in Japan and the Netherlands, while this category of staff appears to be almost non-existent in higher education in the United Kingdom, although it is present in other R&D sectors. The ratio of other support staff to 100 researchers in the government sector is over 50 in Germany, Greece, Ireland, Japan, Mexico and Turkey, with Mexico in particular having a very large proportion of both other support staff and technicians in the government sector (60 other support staff and 74 technicians per 100 researchers).

Figure 6.13. Other support staff to researchers (2016)
FTE other support staff per 100 researchers, overall and by sector of employment

Note: *Participating in the Benchmarking Higher Education System Performance exercise 2017/2018.

Data refer to 2016 or most recently available year.

Source: Adapted from OECD (2018[16]), OECD Science, Technology and R&D Statistics, https://doi.org/10.1787/strd-data-en.

 StatLink https://doi.org/10.1787/888933941462

Estonia and Belgium both have just under 10 support staff per 100 researchers in higher education, below the OECD average for the sector. Proportions of support staff are also well below the average for the government sector, at around 20 support staff per 100 researchers in each of the two jurisdictions. The Netherlands is one of only a few countries with a greater proportion of support staff working in the higher education sector (44 per 100 researchers) than in the government sector (24 per 100 researchers). This could partly be explained by the national emphasis on maximising the “valorisation” of research and the additional resources devoted to this priority in the Netherlands (see Chapter 7).

While Norway does not have separate data for technicians and support staff, aggregate data for the two categories combined are available. In Norway, there are around 40 technicians and other support staff per 100 researchers. This figure is somewhat higher for the government sector, but markedly lower for the higher education sector, at only 27. These values are below the OECD averages of 51 overall and 69 for the government sector, and broadly in line with the average of 29 for the higher education sector. However, Norway has a very high number of researchers relative to its population, which may indicate that researchers in Norway perform tasks that are carried out by technicians and other support staff in other countries; this may explain the apparently low staffing in these personnel categories.

6.4. Accessing a career in research

Doctoral education represents the key entry point into a career in academia. Most career paths in higher education research require a doctorate as the minimum standard before researchers can progress to the next career level, for example as a post-doctoral researcher, junior lecturer or associate professor (see Chapter 4).

On a global level, the role of doctoral students and graduates within the broader research system could be considered to be at a crossroads. Many countries have been actively encouraging increasing numbers of doctorate holders in the population, and there have been large increases in the numbers of new doctorates worldwide over recent decades (OECD, 2016[3]). However, increased numbers alone may not necessarily meet the needs of the research and development sector. For example, there have been some indications of a slowdown in STEM doctorate graduates in recent years, particularly in the largest doctoral education systems, which could lead to a future shortage of researchers in these fields. At the same time, in some cases, doctoral graduates are facing uncertain and insecure career paths within public research systems. Many doctoral graduates and, increasingly, post-doctoral researchers are leaving the research profession (OECD, 2016[3]).

Nonetheless, a steady supply of skilled knowledge-based capital will be needed to spur the innovations of the future and maximise the potential for future economic progress (OECD, 2015[2]). Furthermore, to participate actively in international innovation networks, countries will need to ensure not only that they have a pool of capable researchers, but also that these researchers have the skills to collaborate effectively across institutions and countries, and that the research they produce is relevant to the international market (OECD, 2017[25]).

Therefore, the policy focus in many countries is beginning to broaden from increasing the volume of doctoral graduates to also ensuring rewarding careers in R&D, addressing systemic and individual challenges that can arise throughout a career in research, and helping doctoral graduates to develop the types of transferable skills that are in demand across the economy. This section looks at how doctoral education is organised (with a particular focus on the participating jurisdictions) and at the flows of students into and out of doctoral studies. The data presented can give an indication of how successful systemic policies and practices are in attracting doctoral students and in providing rewarding conditions that encourage them to complete their studies and progress.

6.4.1. Entering doctoral studies

Across OECD countries, doctoral education is organised in diverse ways, and there are substantial differences in the number and profiles of those who are pursuing doctoral studies. The entry requirements for a doctorate also vary across OECD countries.

Since the introduction of the three-cycle system as part of the Bologna Process in Europe, a master’s qualification is generally the basis for admission to doctoral studies throughout the European Higher Education Area (EHEA). The duration of doctoral studies within the EHEA is typically three to four years. The Canadian doctoral programme is also similar to European approaches, with most students entering on the basis of a master’s degree, though the average time for completion of the doctorate is around six years.

By contrast, in the United States, the majority of students can enter doctoral programmes following the completion of a bachelor’s degree. However, during the first two years of doctoral programmes, students participate in graduate-level coursework and doctoral seminars and colloquia. Students may then be required to pass a qualifying examination in the second or third year of study to be admitted to the research part of the doctoral programme. Students take between six and nine years to complete a doctorate in the United States depending on the subject and the institution.

In Australia, the usual prerequisite for prospective students is the completion of a bachelor’s programme with an honours component (class I or IIA). Alternatively, students may be accepted on the basis of completion of a master’s through research or course work. Doctoral programmes typically take three to four years to complete.

The most common type of qualification obtained from research doctoral studies is the Doctor of Philosophy (PhD), though professional doctoral education has seen significant growth in many countries. Professional or discipline-specific doctorates are most often obtained by undertaking a combined period of study based at a higher education institution (which can comprise taught programmes, research or both) and professional practice, and are oriented more towards applying the skills obtained in professional practice than towards a career as a researcher. While some OECD countries, such as the United Kingdom and the United States, offer increasing numbers of professional doctoral programmes, other countries, such as Canada, have instead opted to add more professionally focused elements to the traditional PhD programme (Chiteng Kot and Hendel, 2012[26]).

Accessing and funding doctoral education in the participating jurisdictions

In all of the participating jurisdictions, admission to doctoral studies is generally on the basis of a master’s degree or an equivalent qualification, with a minimum duration of around three years FTE, though typically completion takes at least four years (Table 6.3). Higher education institutions may have additional requirements for admission, such as interviews, the submission of a research plan, additional examinations, etc. In the Flemish Community and the Netherlands, candidates without a master’s degree may be admitted to a doctoral programme, but only in exceptional cases, and applicants may need to undergo a competence assessment to show their ability to conduct research and write a doctoral thesis.

In Estonia, the Flemish Community and the Netherlands, doctoral studies are carried out only in universities. In Norway, the majority of state institutions and some private institutions also provide doctoral education. In the Netherlands, all doctoral candidates are either part of a graduate school or a research school. Research schools are partnerships between multiple research universities and research institutes, while graduate schools are organised within universities.

The level and type of financial supports for doctoral students are important predictor variables for the completion of doctoral education, with assistantship-type support (where a student receives a stipend in return for the performance of specific research or teaching-related duties) strongly associated with increased completion (Ampaw et al., 2012[27]). All four participating jurisdictions have a range of mechanisms in place to provide financial stability for doctoral students.

Table 6.3. Characteristics of doctoral education in the participating jurisdictions

Estonia
  • Providers of doctoral education: Universities
  • Admissions requirements: Master’s degree or equivalent (required); other admission requirements set by institutions may apply
  • Duration of doctoral studies: 3-4 years FTE (typical duration 4 years)
  • Status of doctoral candidates: Students

The Flemish Community
  • Providers of doctoral education: Universities
  • Admissions requirements: Master’s degree (exceptions apply); other admission requirements set by institutions may apply
  • Duration of doctoral studies: 4 years (intended duration, but on average candidates take about 5 years to complete)
  • Status of doctoral candidates: Students, but in addition they can be considered employees of the university where they study, or of a foundation that provides scholarships for doctoral education

The Netherlands
  • Providers of doctoral education: Universities
  • Admissions requirements: Master’s degree (exceptions apply); other admission requirements set by institutions may apply
  • Duration of doctoral studies: 3 years FTE (minimum duration), but most doctoral candidates working at universities are appointed for 4 years
  • Status of doctoral candidates: Most doctoral candidates are employees of the university where they study; there are also external doctoral candidates

Norway
  • Providers of doctoral education: Universities and some university colleges
  • Admissions requirements: Master’s degree (at ISCED-7); other admission requirements set by institutions may apply
  • Duration of doctoral studies: 3 years FTE (minimum duration); doctoral candidates are normally hired on a 4-year contract (one-quarter of the time dedicated to teaching and other duties at the HEI), while candidates financed through other sources are on 3-year contracts
  • Status of doctoral candidates: Employees of the higher education institution where they study, of a company, or of a public employer; there are also external doctoral candidates

Source: Adapted from Eurydice (2018[28]), National Education Systems, https://eacea.ec.europa.eu/national-policies/eurydice/home_en; information provided by the participating jurisdictions. See the reader's guide for further information.

In most European countries, including Estonia and the Flemish Community, the primary status of a doctoral candidate is that of a student (Eurydice, 2017[29]). In the Flemish Community, doctoral candidates may also be considered employees of the university where they study or of a foundation that provides scholarships for doctoral studies. Around 13% of doctoral candidates in the Flemish Community have both student and employee status (Eurydice, 2017[29]).

In contrast, in the Netherlands and Norway, the primary status of a doctoral candidate is that of an employee of the educational institution, usually for a period of four years (Eurydice, 2017[29]). This applies to most doctoral candidates in Norway and around half of candidates in the Netherlands. In these jurisdictions, some doctoral candidates are also hired as employees of another public or private employer. In the Netherlands, around 45% of doctoral candidates are considered ‘external candidates’. These individuals generally work outside the academic sector (Eurydice, 2017[29]). A small number of doctoral students can also study on the basis of a scholarship, through a scheme introduced in 2015 to attract more talented students to doctoral education. Many of the students benefiting from this scholarship are international students.

In Estonia, doctoral candidates are classed as students and are entitled to social benefits on the same grounds as bachelor’s and master’s students. However, they are also entitled to some employee benefits such as parental leave and pension credits. In 2012, the position of junior researcher was created to encourage doctoral candidates to continue working in the research field after obtaining a doctoral degree. This means that doctoral students can work in parallel as junior researchers and receive a salary in addition to their study allowance.

There are also funding schemes in the participating jurisdictions that support prospective students employed in other sectors outside of academia. For example, in Norway, public sector organisations and businesses that allow their employees to complete a doctorate in their area of work are entitled to financial support from the Research Council of Norway (Research Council of Norway, 2019[30]).

Entering doctoral studies

Numbers of doctoral students have been increasing in recent years across the OECD. Based on patterns of entry for 2016, 2.4% of young people on average across the OECD are expected to enter a doctoral programme or equivalent in their lifetime. By comparison, first-time entry rates at lower levels of higher education are 58% for bachelor’s programmes and 24% for master’s programmes (OECD, 2018[31]). This overall rate masks substantial inter-country differences, however. Entry rates surpass 4% in Switzerland and the United Kingdom but are less than 0.5% in Chile (Figure 6.14).
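
The entry rates cited here are synthetic, cohort-based measures rather than simple headcounts. As a minimal sketch, assuming the rate is obtained by summing age-specific entry rates (the actual OECD calculation involves further adjustments, for example to identify first-time entrants), the indicator can be computed as follows; all figures are invented for illustration.

```python
# Minimal sketch of a synthetic entry rate, assuming it is the sum of age-specific
# entry rates (new entrants of each age divided by the population of that age).
# All figures are hypothetical; this is not the exact OECD methodology.

new_entrants_by_age = {27: 120, 28: 110, 29: 100, 30: 90, 31: 80}       # hypothetical counts
population_by_age   = {27: 21000, 28: 20500, 29: 20000, 30: 19800, 31: 19500}

entry_rate = sum(
    new_entrants_by_age[age] / population_by_age[age]
    for age in new_entrants_by_age
)
print(f"Synthetic entry rate: {entry_rate:.1%}")  # share of a cohort expected to enter
```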

Figure 6.14. Entry rates at doctoral level (2016)
Including and excluding international students

Note: *Participating in the Benchmarking Higher Education System Performance exercise 2017/2018.

Data on doctoral students exclude those who are employed outside of higher education.

Source: Adapted from OECD (2018[32]), OECD Education Statistics, https://doi.org/10.1787/edu-data-en.

 StatLink https://doi.org/10.1787/888933941481

Doctoral education is characterised by a relatively high level of internationalisation, reflecting policy efforts to increase international mobility in the scientific community and among highly skilled individuals (OECD, 2017[13]). On average across the OECD, more than one out of four new entrants to doctoral education is an international student, compared to one out of five at the master’s level and one out of ten at the bachelor’s level (OECD, 2016[33]). Luxembourg had the highest proportion (78%) of international new entrants at the doctoral level among OECD countries in 2016, and around one in two new entrants in New Zealand, Switzerland and the United States were international students in the same year. In some countries, such as Greece and Mexico, international students accounted for less than 5% of all new entrants at the doctoral level (Figure 6.14).

When excluding international students, first-time entry rates at the doctoral level in 2016 decreased from 2.4% to 1.7% on average in OECD countries and by more than half in Switzerland (from 4.7% to 2.0%) and New Zealand (from 3.2% to 1.3%) (Figure 6.14).

Within the participating jurisdictions with available data, Estonia and the Netherlands had entry rates at the doctoral level below the OECD average in 2016 with first-time entry rates of 2% and 1.5% respectively, while Norway was marginally above the OECD average with a first-time entry rate of 2.7%. International entrants represented 43% of new entrants to doctoral education in the Netherlands, which was 14 percentage points above the OECD average. In Norway and Estonia, international entrants accounted for 31% and 19% of new entrants respectively (Figure 6.15).

Figure 6.15. Profile of first-time new entrants to doctoral studies (2016)
Percentage of total

Note: *Participating in the Benchmarking Higher Education System Performance exercise 2017/2018.

Source: Adapted from OECD (2018[32]), OECD Education Statistics, https://doi.org/10.1787/edu-data-en.

 StatLink https://doi.org/10.1787/888933941500

The profile of doctoral candidates

Based on 2016 evidence for OECD countries, students are on average 31 years old when they first enter a doctoral programme. However, the age at which students first start doctoral studies varies across countries. For example, in the Netherlands, students are 26 years old on average when they first start a doctoral programme, whereas in Portugal the average age of entry is 35 (OECD, 2018[31]). This could be a function of the age at which students graduate from lower levels of higher education, the flexibility of the higher education system, or cultural expectations (such as a preference for having work experience before entering a doctoral programme).

Overall, approximately 59% of new entrants to doctoral education across the OECD are below the age of 30. In some countries, such as the Czech Republic and France, more than 75% of new entrants to doctoral programmes are below the age of 30, while in others, such as Israel and Portugal, less than 40% of new entrants are below this age (Figure 6.15).

The Netherlands is the country with the largest proportion of younger entrants to doctoral education among OECD countries, with 87% of new entrants to a doctoral programme below the age of 30 in 2016. In Estonia, 67% of new entrants were under 30 in 2016 while less than half (46%) of entrants were under this age in Norway.

While starting ages differ, it is clear that in most OECD countries doctoral students are most likely to be going through their studies while in their 30s. Insecurity about career prospects and the limited financial resources often associated with early-stage careers in research (and, in some countries, the accumulation of debt over this period) can be at odds with other sectors, which may offer greater job security and benefits for similar levels of skills and experience within the age cohort. This also means that doctoral graduates tend to enter the labour market at a later stage than peers choosing other career paths. Furthermore, the employment prospects of doctoral graduates can vary: while overall unemployment rates for doctoral graduates are very low, the higher education sector appears to absorb only about one-third of doctoral graduates, which may mean that many young researchers are not able to follow their preference for an academic career (Section 6.5).

Figure 6.15 also shows the share of female new entrants to research careers, based on 2016 data. On average, close to 49% of new entrants to doctoral education in OECD countries were women in 2016, reflecting the progress made in recent years in closing the gender gap in higher education enrolments at all levels. The lowest proportions of women entering doctoral programmes were in Japan (about 30%) and Chile, Korea, Luxembourg and Turkey (around 40%), while the proportion was more than 50% in a group of countries including Finland, Iceland and Poland. However, other gender gaps remain in research (see Box 6.2).

Women accounted for around 50% of new entrants to doctoral education in the Netherlands and Norway in 2016, just above the OECD average. In Estonia, over 52% of new entrants to doctoral education were women.

6.4.2. Completion of doctoral programmes

Doctorates are awarded once a set of requirements has been fulfilled, demonstrating that the standard for the award has been met. Doctoral degrees can be awarded on the basis of the public defence of a thesis, the publication of a minimum amount of material, or other means, such as completing a combined programme of teaching and research or, in the case of professional doctorates, other practice-related milestones. Though assessment practices differ across countries, most processes in European countries, including the participating jurisdictions, entail the preparation of a substantive body of research work and a subsequent defence of the work before an academic committee (Box 6.3).

Box 6.3. Assessment practices for awarding a doctoral degree

In Estonia, doctoral studies are carried out on the basis of an individual work plan, the progress of which is periodically assessed by an attestation committee. Participation in international scientific conferences, international doctoral courses, study activities organised by doctoral schools, and training in laboratories abroad may count towards the fulfilment of this work plan (Eurydice, 2016[34]). Independent research in the form of a thesis, a series of publications accompanied by a summary article, or a published monograph can be recognised as a doctoral thesis. The degree of ‘doctor’ is awarded after the completion and public defence of the thesis.

In the Flemish Community, the degree of ‘doctor’ is awarded after a period of scientific research and the public defence of a doctoral thesis before a university panel of academics. At most universities, doctoral fellows follow training organised by doctoral schools before defending the doctoral thesis (Eurydice, 2014[35]).

In the Netherlands, the progress of a doctoral candidate is evaluated on an individual basis, usually through an arrangement made between the candidate and the supervisor. The status of the supervisor remains provisional until their official appointment shortly before the doctoral defence. The doctoral dissertation of the candidate is first approved by the supervisor and then provided to a panel of at least three academics to decide whether the dissertation satisfies the standard required for a doctorate (Eurydice, 2014[35]).

In Norway, at least three senior academics sit on the committee that evaluates a candidate’s doctoral thesis, and at least one of them must come from another institution in Norway or from abroad (Eurydice, 2011[36]). The doctoral degree is awarded after a public thesis defence. The traditional doctorate leads to the degree of ‘doctor of philosophy’, which must be based on high-level research.

Another major model of doctoral assessment is in place in the United States, where it is common for doctoral candidates to receive more formative assessment throughout the process and to first defend their progress before a committee, preparing the dissertation only after this examination has been passed successfully (Barnett et al., 2017[37]).

Expected graduation rates from doctoral education can give an indication of the relative success of OECD countries in producing young research talent. Based on patterns of graduation for 2016, approximately 1.8% of young people across the OECD are expected to graduate from a doctoral programme in their lifetime, compared to 18% who are expected to graduate with a master’s degree and 38% with a bachelor’s degree (OECD, 2018[31]).

In 2016, first-time graduation rates at the doctoral level exceeded 3% in only three countries: Denmark, Switzerland and the United Kingdom (Figure 6.16). These countries also have some of the highest first-time entry rates and the largest shares of international students in doctoral education in the OECD. When excluding international students, the average first-time graduation rate across OECD countries dropped to 1.2%. Across the OECD, around 30% of students who graduated from a doctoral programme in 2016 were international students, compared to 19% of those who received a master’s degree and 7% of those who were awarded a bachelor’s degree for the first time (OECD, 2017[38]).

Figure 6.16. Graduation rates at doctoral level (2016)
Including and excluding mobile students

Note: *Participating in the Benchmarking Higher Education System Performance exercise 2017/2018.

Source: Adapted from OECD (2018[32]), OECD Education Statistics, https://doi.org/10.1787/edu-data-en.

 StatLink https://doi.org/10.1787/888933941519

Among the participating jurisdictions, first-time graduation rates exceed the OECD average in the Netherlands, where 2.4% of young people are expected to graduate at the doctoral level. Norway is just below the average, while Estonia and Belgium fall further below it, with first-time graduation rates at the doctoral level of 1.3% and 0.6% respectively. When excluding international students, first-time graduation rates drop by as much as 50% in Belgium (from 0.6% to 0.3%) and by 40% in the Netherlands (from 2.4% to 1.4%).

In the Netherlands, graduation rates are considerably higher than entry rates, both for all students and when mobile students are excluded. This may reflect the fact that doctoral researchers do not initially register as doctoral students and are thus excluded from the entry rate statistics. It would also explain why entry rates in the Netherlands are well below the OECD average, whereas graduation rates are well above the OECD average for all students and in line with the average when excluding mobile students.

Comparing the rates in Figure 6.14 and Figure 6.16 may suggest that entry rates are growing, but could also indicate that many candidates do not complete doctoral education. Internationally comparable data on completion rates in doctoral programmes are not currently available, but evidence from individual country studies indicates that completion rates are relatively low across the OECD. Non-completion rates have been estimated to be as high as 50% in many countries (Van Der Haert et al., 2013[39]). This represents a cost for both individuals and higher education systems as a whole. Non-completers may experience lower employment prospects and a decrease in self-esteem, while at the system level there are losses of financial and human resources, as well as the lost potential of research that will never be completed (Litalien and Guay, 2015[40]).

While there are limited studies on those who drop out of doctoral education, emerging evidence indicates that a number of personal and institutional factors can play a role in the decision to leave doctoral education. In a recent study, for example, more than one-third of doctoral students reported their intention to drop out, based on a range of factors including the difficulty of balancing doctoral studies and personal life, and problems with isolation and a lack of integration into their local academic community (Castelló et al., 2017[41]).

Evidence also suggests that doctoral completion rates can be improved through specific institutional practices, for example through ensuring academic staff are well prepared to supervise doctoral students (Box 6.4). Encouraging these practices can help to reduce costs related to non-completion.

Box 6.4. Social support and doctoral completion

Many factors play a part in doctoral non-completion. While adequate financial support is important, social support also plays a key role in improving the experience of doctoral candidates and improving completion rates. The role and approach of the doctoral supervisor is particularly vital in this regard. Professional and emotional support from an engaged doctoral advisor can make stressful parts of doctoral education, such as writing the dissertation, feel less stressful. Doctoral candidates are also more likely to progress in their professional development if they have a supervisor who is well connected to the relevant professional networks and the wider group of scholars in the field, and when the supervisor and other faculty allocate time to organising opportunities to discuss research questions and improve their scholarship (Jairam and Kahl, 2012[42]).

Some OECD countries are using funding mechanisms to encourage higher education institutions to increase the number of students graduating with doctoral degrees. For example, Estonia, the Flemish Community, the Netherlands and Norway take into consideration the number of defended doctoral degrees when allocating R&D funding to institutions. Estonia has also set a target to increase the number of new doctoral graduates in an academic year to 300 by 2020 (Estonian Ministry of Education and Research, 2014[12]). This figure amounted to 190 in 2012, and had increased to 253 by 2017.

6.5. Profile of doctorate holders in the population

As the number of individuals with advanced research qualifications expands, it is becoming increasingly possible to identify them as a separate group and provide more detailed information on their profiles and labour market outcomes. The outcomes of doctorate holders are of particular policy interest, given the substantial government investment in doctoral education by many national research systems.

On average across OECD countries, 1.1% of the population aged 25-64 had completed a doctoral level programme in 2017 (Figure 6.17). However, the share of doctoral holders in the population varied substantially among OECD countries, from less than 0.5% in Latvia, Mexico and Turkey to 2% or more in Luxembourg, Slovenia and Switzerland. In the participating jurisdictions, doctorate holders accounted for 1.1% of the population in Norway, similar to the OECD average, while they represented less than 0.6% of the population in Estonia, Flanders and the Netherlands.

Figure 6.17. Share of doctoral holders in the population (2017)
25-64 year-olds

Note: *Participating in the Benchmarking Higher Education System Performance exercise 2017/2018.

Source: Adapted from OECD (2018[32]), OECD Education Statistics, https://doi.org/10.1787/edu-data-en.

 StatLink https://doi.org/10.1787/888933941538

6.5.1. Careers of doctorate holders

The UNESCO/OECD/Eurostat data collection on the Careers of Doctorate Holders (CDH) was initiated in 2011 in order to improve the information available about the profile and career patterns of doctorate holders in the population, given their importance in national research systems. Data are collected every two years at the aggregate level from OECD member countries, which provide the aggregates based on a range of national data sources, including labour force surveys and population registers (OECD, 2013[43]).

The 2016 version of the data collection covered 16 OECD countries and Flanders. The CDH data show that doctorate holders are more likely to move across borders than many other categories of the population. On average across OECD countries with available data, foreign-born doctorate holders accounted for nearly one-quarter of all doctorate holders in 2016 (Figure 6.18, Panel A). In addition, 14% of doctorate holders were foreign citizens in 2016, on average across OECD countries.

In Norway, foreign-born doctorate holders made up 45% of the total doctorate holders in the population, the third largest share among OECD countries with available data. Norway also had the second highest share of foreign citizen doctorate holders among OECD countries with available data (37%), indicating that Norway is an attractive destination for talent with advanced qualifications.

In Flanders, the share of foreign-born doctorate holders was slightly above the average across OECD countries, with 25% of doctorate holders being foreign-born. On the other hand, in Estonia and the Netherlands, the share of foreign-born doctorate holders was below the average, at 16% and 14% respectively. Similarly, the share of foreign citizen doctorate holders was above the average in Flanders (16%), while it was below the average in Estonia (9%) and the Netherlands (6%).

Doctorate holders are more likely to be foreign-born or foreign citizens than master’s holders (Figure 6.18, Panel B). The shares of foreign-born individuals and foreign citizens were 4 percentage points higher among doctorate holders than among master’s holders, on average across OECD countries in 2016. However, this pattern does not hold equally across countries. For example, while in Flanders and Norway the shares of foreign-born individuals and foreign citizens among doctorate holders were around double those among master’s holders, the share of foreign citizens among doctorate holders was lower in Estonia and the same for both master’s and doctorate holders in the Netherlands.

In comparison with the general trends for fields of study among the population with higher education as a whole, doctorate holders are less likely to specialise in education; arts and humanities; social sciences; and business administration and law. On average across OECD countries with available data, over half of master’s holders studied these subjects, compared to one-third of doctorate holders. Less than 20% of doctorate holders completed their doctoral studies in the field of health and welfare, while around 11% studied in each of the fields of arts and humanities, engineering and social sciences (Figure 6.19).

On the other hand, more than one-quarter of doctorate holders in OECD countries with available data studied natural sciences. This is a much higher proportion than the overall proportion of graduates from natural sciences programmes, where on average across the OECD, less than 7% of graduates earned a qualification in natural sciences in 2015 (OECD, 2018[32]). This highlights the prominent role that doctoral education plays within economies to provide the advanced STEM qualifications required in many areas of the labour market.

Differences in emphasis on various fields of study are also evident across the four participating jurisdictions. In Flanders, a relatively large share of doctorate holders specialised in engineering (18% compared to the OECD average of 11%). In the Netherlands, doctorate holders who studied social sciences accounted for 17% of the total cohort, higher than the OECD average of 11%, while in Norway, 16% of doctorate holders studied arts and humanities, also above the OECD average of 11%.

Figure 6.18. Advanced degree holders by country of birth and citizenship (2016)

Note: *Participating in the Benchmarking Higher Education System Performance exercise 2017/2018.

Chile, Latvia and the United States: Data refer to 2015. Finland: Data refer to 2014. The Netherlands: Data refer to 2013.

Source: OECD Careers of Doctorate Holders survey.

 StatLink https://doi.org/10.1787/888933941557

Figure 6.19. Doctorate holders by field of study (2016)

Note: *Participating in the Benchmarking Higher Education System Performance exercise 2017/2018.

Countries and economies are ranked in descending order of the share of doctorate holders in: education; arts and humanities; social sciences; and business administration and law. Chile, Latvia and the United States: Data refer to 2015. Finland: Data refer to 2014. The Netherlands: Data refer to 2013.

Source: OECD Careers of Doctorate Holders survey.

 StatLink https://doi.org/10.1787/888933941576

Around 35% of doctorate holders were employed in the education sector in 2016, on average across OECD countries with available data (Figure 6.20). In Estonia and Flanders, the shares of doctorate holders working in the education sector were above the average level, while the share was below the average in the Netherlands. The substantial share of doctorate holders working outside of the education sector may suggest that there is a strong demand for the skills and knowledge provided by doctoral education in the wider labour market, especially given the tendency for doctorate holders to qualify in higher numbers in fields that are in high demand in the labour market. However, the relatively low rate of absorption into the education sector may also be indicative of a shortage of jobs, particularly in academia.

Figure 6.20. Doctorate holders by industry of employment (2016)

Note: *Participating in the Benchmarking Higher Education System Performance exercise 2017/2018.

Latvia: Data refer to 2015. Finland: Data refer to 2014. The Netherlands: Data refer to 2013.

Source: OECD Careers of Doctorate Holders survey.

 StatLink https://doi.org/10.1787/888933941595

6.6. Internationalisation of research

6.6.1. International mobility

International mobility in R&D is important because it facilitates the circulation of knowledge and affects the quality of research. International mobility is also crucial to the innovation process; increasingly it is recognised that international collaboration, including the mobility of students and researchers, is likely to yield better results for innovation processes than continuously intensifying a “race for talent and investment” (OECD, 2017[25]).

International mobility is characterised in some OECD countries as a “brain circulation” where countries experience both inflows and outflows of talent. One measure of brain circulation is to examine the net flows of scientific authors, using bibliometric data available from the Scopus database, which provides data on the location of the affiliations of scientific authors over the time of their publications. These data therefore give an indication of those who move to another country or economy, those who stay in the same location, and those who return to the economy in which they first published (Figure 6.21 and Figure 6.22).2 According to the Scopus data, researchers who conduct research abroad and return to the economy in which they first published contribute to raising the overall impact3 of domestic research by 20% on average (OECD, 2017[13]).
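
These indicators rest on tracking the country of each author’s affiliations across publication years. The sketch below is a simplified, hypothetical illustration of that logic: it classifies authors relative to a reference economy as stayers, returnees, new inflows or outflows, and derives a net flow. The actual OECD analysis of Scopus data uses more detailed affiliation histories and fractional counting, so this is an approximation of the approach rather than a reproduction of it.

```python
# Simplified classification of scientific authors by mobility status relative to a
# reference economy, based on the chronological list of their affiliation countries.
# Hypothetical illustration only; not the exact OECD/Scopus methodology.

def classify(history, economy):
    """Classify an author from a chronological list of affiliation countries."""
    first, last = history[0], history[-1]
    if first == economy and last == economy:
        # Authors who published abroad in between are counted as returnees.
        return "returnee" if any(c != economy for c in history) else "stayer"
    if first != economy and last == economy:
        return "new inflow"
    if first == economy and last != economy:
        return "outflow"
    return "not affiliated"

authors = {                                  # hypothetical affiliation histories
    "author_a": ["NOR", "NOR", "NOR"],
    "author_b": ["NOR", "GBR", "NOR"],
    "author_c": ["DEU", "NOR"],
    "author_d": ["NOR", "USA"],
}

counts = {}
for history in authors.values():
    status = classify(history, "NOR")
    counts[status] = counts.get(status, 0) + 1

net_flow = counts.get("new inflow", 0) - counts.get("outflow", 0)
print(counts, "| net flow:", net_flow)
```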

Net flows of research authors for the OECD as a whole since 2002 appear to be negative according to the Scopus data; over the period 2002-2016 there was a net outflow of almost 14 000 researchers in total (OECD, 2017[13]). Relative to the size of the population of 25-64 year-olds, Luxembourg, Switzerland, Chile, Iceland and Norway have the largest positive net flows of researchers, while Italy and Greece have the largest negative relative flows (Figure 6.21). Among the participating jurisdictions, both Norway and Estonia experienced a net brain gain over the period, though the gain for Norway was over double that for Estonia. At the same time, between 2002 and 2016 Belgium and the Netherlands experienced close to even flows overall relative to the population.

In general, individual researchers who move to other countries are more likely to be associated with higher impact publications than researchers who have stayed in their original countries or returned. This appears to be mostly the case when moving from lower to higher performing research systems. For example, in the United States, researchers who leave the country tend to have lower journal scores, while those who move to the United States have higher scores than those who have stayed there, providing an indication that this country is very attractive for talented researchers (OECD, 2017[13]).

Figure 6.21. International net flows of scientific authors, selected economies (2002-2016)
Difference between annual fractional inflows and outflows per 100 FTE researchers

Note: *Participating in the Benchmarking Higher Education System Performance exercise 2017/2018.

OECD calculations based on Scopus Custom Data, Elsevier, Version 4.2017, July 2017.

Source: Adapted from OECD (2017[13]), OECD Science, Technology and Industry Scoreboard 2017: The digital transformation, https://doi.org/10.1787/9789264268821-en.

 StatLink https://doi.org/10.1787/888933941614

In recent years, OECD countries have made substantial efforts to attract international doctoral students and more established researchers to help enhance their research performance. Most recently in the participating jurisdictions:

  • Estonia established the Dora Plus and Mobilitas Plus programmes with support from European regional development funds to attract students and researchers from abroad, improve Estonia’s reputation as a destination for research and expand transnational collaboration opportunities. Among other supports, the Dora Plus programme provides scholarships for international students for study visits to Estonia and support to higher education institutions in Estonia to organise short-term courses for international study groups. Initiatives under Mobilitas Plus include post-doctoral research grants for researchers coming from abroad, and returning researcher grants for researchers returning to Estonia after completing a period of research abroad. The programme will continue until 2023.

  • The Flemish Community has established several programmes to attract talented researchers from abroad and to promote outgoing mobility. For example, the Odysseus programme supports researchers from abroad who are already considered to be leading in their field, including promising post-docs, to start a research group in a Flemish university. These individuals are offered a permanent position at a Flemish university and project funding to establish a research team.

  • Similarly, higher education institutions in the Netherlands encourage incoming and outgoing mobility of researchers and have designated funds to support such initiatives. Some research universities set aside annual funds for the recruitment of talented foreign research fellows and visiting professors. The Academy of Sciences and the Research Council also provide funding to stimulate international mobility among researchers.

Despite the increasing policy focus and an expansion of initiatives in recent years, bibliometric analysis suggests that, in any one year, the vast majority of researchers are not internationally mobile (Figure 6.22). In 2016, on average across the OECD, 94% of scientific authors were classed as “stayers”, meaning that their 2016 affiliations and pre-2016 affiliations were based in the same country (OECD, 2017[13]). However, mobility patterns and the extent of brain circulation vary across economies. For example, in Greece, Hungary, Spain and the Slovak Republic, among others, the majority of inflows are returnees originally affiliated with an institution in the country, whereas in most countries the majority of researchers with an international mobility record represent new inflows (Figure 6.22).

Figure 6.22. International mobility of scientific authors (2016)
As a percentage of scientific authors, by last main recorded affiliation in 2016

Note: *Participating in the Benchmarking Higher Education System Performance exercise 2017/2018.

OECD calculations based on Scopus Custom Data, Elsevier, Version 4.2017, July 2017.

Source: Adapted from OECD (2017[13]), OECD Science, Technology and Industry Scoreboard 2017: The digital transformation, https://doi.org/10.1787/9789264268821-en.

 StatLink https://doi.org/10.1787/888933941633

Differences in flow patterns can also be observed across the participating jurisdictions. Belgium has one of the highest rates of brain circulation among OECD countries, with new inflows and returnees combined accounting for 8% of all scientific authors in 2016, while outflows were also of the order of 8%. Norway had a slightly positive net inflow (+1.7%), though overall flow rates were lower than in Belgium. In Estonia and the Netherlands, there was less than one percentage point difference between inflow and outflow rates in 2016.

6.6.2. International collaboration

Along with the mobility of talent, levels of international collaboration indicate the ability of research systems to participate in global research and innovation networks. On average across OECD countries, just under 30% of domestically authored documents involved some collaboration with researchers in other countries in 2015 (Figure 6.23). The share of publications involving international collaboration was more than 50% in Iceland and Luxembourg, both relatively small countries where the need to collaborate internationally in research may be stronger given the lower likelihood of national networks of specialists within particular fields.

At the other end of the scale, less than 15% of publications in Japan, Korea, Poland and Turkey involved international collaboration, and international collaboration was also below 20% in the United States. The lower rate in the United States may be explained by its relatively advanced domestic scientific network, which provides greater possibilities for national collaboration.
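
The figure is based on fractional counting of publications. As a minimal sketch, assuming each document is attributed to countries in proportion to its authors’ affiliations and counts as internationally collaborative for a country when it also has authors affiliated abroad, the collaboration share can be computed as follows; the data are invented and the exact OECD/Scopus procedure differs in its details.

```python
# Hypothetical sketch of an international collaboration share using fractional counting.
# Each document is attributed to countries in proportion to its authors' affiliations,
# and counts as internationally collaborative for a country when at least one co-author
# is affiliated abroad. Illustrative only; not the exact OECD/Scopus methodology.

documents = [
    ["BEL", "BEL", "NLD"],   # author affiliation countries for each document
    ["BEL"],
    ["BEL", "FRA", "DEU"],
    ["NLD", "NLD"],
]

def collaboration_share(documents, country):
    domestic = 0.0        # fractional count of the country's documents
    collaborative = 0.0   # fractional count of those with a foreign co-author
    for affiliations in documents:
        share = affiliations.count(country) / len(affiliations)
        if share == 0:
            continue
        domestic += share
        if any(c != country for c in affiliations):
            collaborative += share
    return collaborative / domestic

print(f"BEL: {collaboration_share(documents, 'BEL'):.0%} international collaboration")
```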

Figure 6.23. International scientific collaboration (2015)
As a percentage of domestically authored documents, fractional counts

Note: *Participating in the Benchmarking Higher Education System Performance exercise 2017/2018.

OECD calculations based on Scopus Custom Data, Elsevier, Version 4.2017, July 2017.

Source: Adapted from OECD (2017[13]), OECD Science, Technology and Industry Scoreboard 2017: The digital transformation, https://doi.org/10.1787/9789264268821-en.

 StatLink https://doi.org/10.1787/888933941652

Language may also create barriers to international collaboration. While English has been adopted as the common international language for scientific publications, the majority of scientists globally are not native English speakers, and there are differences between countries in the proportions of scientific publications that are published in English. This can cause problems both in terms of transferring knowledge and discovering potential collaborators in the field (Meneghini and Packer, 2007[44]).

In the participating jurisdictions, Belgium, the Netherlands and Norway all had higher shares of international collaboration in publications than the average in 2015, while the share in Estonia was just below the average. The share of publications involving international collaboration was particularly high in Belgium, where almost 40% of all scientific publications in 2015 involved some form of international collaboration.

The above-average level of international collaboration in the Netherlands may be explained by an active involvement of higher education institutions in international alliances and consortia, such as the League of European Research Universities, the European Consortium of Innovative Universities and the IDEA League. Many universities are also active members of research consortia funded by the European Commission. Moreover, under the SEO (Stimulering Europees Onderzoek) scheme, the Netherlands Organisation for Scientific Research (NWO) provides additional funding to universities based on the number of international research projects funded through the Horizon 2020 programme.

In Norway, institutions can also benefit from additional government funding if they receive grants from European interregional co-operation initiatives. Norway’s long-term strategy outlines objectives and priorities for research co-operation in the European Research Area and the Horizon 2020 programme (OECD, 2016[45]). To support these objectives, the Research Council of Norway increased the budget for supporting the participation of public research organisations in the EU Framework Programme to NOK 140 million in 2015 (OECD, 2016[45]). Norway additionally has a number of policies to develop international relationships, which can benefit the higher education R&D sector, such as:

  • international co-supervision of doctoral candidates with a co-operating institution abroad (cotutelle)

  • the INTPART and UTFORSK initiatives, managed by the Research Council of Norway and the Norwegian Agency for International Co-operation and Quality Enhancement in Higher Education, which fund research partnerships and project co-operation with institutions in a number of countries (including Brazil, China, India, Russia, South Africa and the United States).

Estonia has set targets to strengthen international co-operation in research. It aims to increase the share of national public funding for internationally co-ordinated research to 3% of government budget appropriations or outlays for R&D (GBAORD) by 2020 (Estonian Ministry of Education and Research, 2014[12]), up from 1.3% in 2010. Estonia is also a member of or participant in various international research infrastructures and organisations specialising in health, technology, life sciences and related fields, such as the European Space Agency, the European Molecular Biology Conference (EMBC) and the European Organization for Nuclear Research (CERN).

6.7. Measuring and improving research performance

As research activity and investment increase, so does the imperative to measure their impact and evaluate their performance. This is particularly necessary in the case of public research, where there is a renewed focus on accountability for public spending and an increasing requirement for knowledge and evidence on which to base future funding decisions.

Recent OECD work has highlighted the general challenges faced across OECD countries in evaluating the outputs of research and development. The available metrics and approaches for measuring the social and economic impact of R&D suffer from a number of limitations, even as international rankings grow in importance. In addition, the links between the evaluation of research and policy making, including the setting of priorities for the system, are not always clear (OECD, 2016[46]). Developing new and robust ways to measure research performance and set systemic priorities is therefore likely to remain an area of policy focus into the future. National initiatives that aim to evaluate and improve the quality and relevance of research are in place in many countries, including the four participating jurisdictions.

Estonia has had a policy monitoring programme for research, development and innovation in place since 2011, co-ordinated by the University of Tartu. The programme was revised in 2015 to strengthen co-operation between government, higher education institutions and the private sector, and to enhance the role of science and research in the economy (OECD, 2016[47]). The new programme, RITA, examines the implementation of research, development and innovation strategies in co-operation with Tallinn University, the University of Tartu, Tallinn University of Technology, the Estonian Academy of Sciences and the Estonian Research Council.

In order to monitor progress towards the policy objective of aligning R&D activities with the interests of Estonian society and the economy (Estonian Ministry of Education and Research, 2014[12]), the government introduced two indicators for 2020: one measuring government budget appropriations by socio-economic objective and the other measuring the share of public sector R&D expenditure financed by the private sector.

In addition, in 2014, the government allocated EUR 123 million to support institutional development plans and structural reforms, including mergers of higher education and R&D organisations, and to improve the quality of research (OECD, 2016[47]). New measures to strengthen public sector innovation and to improve the capacity of higher education institutions and public research organisations to undertake socially relevant research have also been implemented (Kattel and Stamenov, 2017[4]).

The Flemish Community has also adopted measures to increase efficiency in R&D. A number of research and innovation agencies have been merged, and funding for R&D has been reformed to streamline different research activities and simplify the application process for research funding. Strengthening policy evaluation capacity has also been a priority, both at the federal level and within individual communities. The Flemish Community, for example, has recently evaluated the application procedures for projects and grants of the Research Foundation (OECD, 2016[48]).

In the Netherlands, measurement and improvement of research performance takes place within large research programmes, while measurement as such is also part of national monitoring of R&D activities. The National Research Agenda (NWA) was developed in a bottom-up process with researchers, the private sector, NGOs, citizens and other stakeholders. Research questions were grouped into 25 ‘routes’ that combine scientific and societal challenges (Dutch Ministry of Education, Culture and Science, 2019[9]). The measurement framework of the NWA includes parameters on collaboration between different types of actors (universities, applied research (TO2) institutes, the private sector, NGOs, government agencies, etc.). In terms of output and impact, established indicators such as publications and IPR are used alongside qualitative indicators for knowledge sharing and addressing societal challenges.

Measuring and improving research performance is also addressed in the “top sectors” initiative (see Chapter 7) and its evolution to a mission-driven innovation policy. This initiative seeks to tailor public resources to priority sectors of the economy and to strengthen coordination of activities in these sectors by government, business and knowledge institutions (OECD, 2016[49]). Every two years, the Dutch Statistical Office evaluates the progress of the “top sectors” initiative in the areas of macro-economy, enterprise development, employment characteristics, innovation performance and education output (OECD, 2017[50]). In addition, Statistics Netherlands, the Rathenau Institute and the Association of Universities in the Netherlands (VSNU) monitor investments, activities and results in R&D and innovation.

Norway has adopted a number of reforms to increase the effectiveness and efficiency of public research. These have included structural reforms involving several mergers of higher education institutions, and funding reforms, including revisions to the indicators considered in the block grant for higher education institutions and an experiment involving performance contracts (OECD, 2016[45]). The Long-term Plan for Research and Higher Education 2019–2028 serves as the key guiding policy framework for higher education and R&D in Norway. It outlines five priority areas which reflect a mixture of social and economic goals: oceans; climate, environment and clean energy; public sector innovation for better and more efficient services; enabling and industrial technologies; and civic protection and social cohesion in a globalised world (Norwegian Ministry of Education and Research, 2018[7]). In 2016, Norway also introduced stricter requirements for institutional accreditation in order to improve the quality of research and education in higher education institutions (OECD, 2016[45]). Among other factors, these requirements consider the relevance of research to the regional business community and the nature and size of doctoral provision (OECD, 2016[45]).

6.7.1. Monitoring research productivity and quality

In tandem with the increase in the volume of research activity and growing investment in research, there has been an expansion of measures which aim to provide an indication of research and development performance and impact. Pressure at the political level to demonstrate the effectiveness of public spending, the growth of bibliometric analysis, and increasing volumes of both quantitative and qualitative information about research output have led to a research-related “metric tide” (Wilsdon et al., 2015[51]). These metrics can relate to the output of individual researchers, or can be aggregated to provide measures of quality and performance for journals, institutions and national systems (Box 6.5).

Box 6.5. Key terms related to research productivity and quality
Most measures of research quality and productivity are based on bibliometrics, such as the number of scientific publications and number of citations (the number of times an individual published paper is referenced in the work of other scientific authors). Key relevant bibliometrics which have grown in popularity and use in recent years include:

Citation count: The number of times a paper has been cited in other publications.

H Index: Designed to measure both productivity and quality at the individual level, the H index is the largest number h such that the author has h publications that have each been cited at least h times (Hirsch, 2005[52]). For example, an H Index of 10 implies that the author has 10 papers that have each been cited at least 10 times (an illustrative calculation follows at the end of this box).

Impact factor: The impact factor measures how often on average each article in a journal is cited in a given year (Glänzel and Moed, 2002[53]). High-impact journals can be defined as those that have the highest levels of citations within their particular journal category or specialty (Garfield, 2003[54]).

Scientific production (of a country): The total amount of publications by authors affiliated with institutions in that country in a given year (OECD and SCImago Research Group, 2016[55]).

Altmetrics: Alternative measures of impact, such as the number of times a publication is mentioned on social media, discussed in blogs or mentioned in news sites.
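
As a purely illustrative aid, the short sketch below computes two of these measures, the H index and a simple two-year impact factor, from hypothetical citation counts. The function names and all figures are invented for illustration; actual values depend on the coverage and methodology of the indexing database used.

```python
# Illustrative calculations of two bibliometrics defined in this box.
# All citation counts are hypothetical; real values depend on the indexing database used.

def h_index(citations_per_paper):
    """Largest h such that the author has h papers, each cited at least h times."""
    counts = sorted(citations_per_paper, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def two_year_impact_factor(citations_this_year, items_published_previous_two_years):
    """Average citations received this year by items the journal published in the two preceding years."""
    return citations_this_year / items_published_previous_two_years

# A hypothetical author with papers cited 25, 12, 10, 4, 2 and 0 times has an H index of 4.
print(h_index([25, 12, 10, 4, 2, 0]))    # -> 4

# A hypothetical journal receiving 180 citations in 2015 to the 60 articles it published in 2013-2014.
print(two_year_impact_factor(180, 60))   # -> 3.0
```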

Quantitative measures of research productivity and quality are still recognised as experimental in nature, and questions remain about how fully such measures can cover research activity, given that there is no single central repository of all scientific publications and that indexed repositories differ in the methodologies they use to calculate such metrics (OECD, 2017[13]). However, because of the volume of information available, they have become widely adopted as the best available measures of research performance.

Despite the increasing access to metric performance data, qualitative evaluation through peer review remains the backbone of quality assurance in scientific production, both for reviewing individual research outputs and determining which research project proposals should be funded. Peer review of research proposals can help to increase the probability of the highest quality research being supported financially. However, the peer review process for journal publications has also attracted criticism, both because of the delays it introduces in communicating scientific results and because of emerging evidence of various types of bias and a lack of reliability and predictability in review processes (Bornmann, 2013[56]). While no alternative has arisen to challenge peer review, it is likely that future measures of research performance will increasingly attempt to combine both qualitative and quantitative elements, to provide a more multidimensional view of performance and increase confidence in the process (OECD, 2016[46]).

However, while peer review and bibliometric data can give some information on aspects of quality, there are other quality issues related to research publications for which solutions must be found in the research community. A major quality challenge relates to the reproducibility of research; an increasing number of studies across various fields show that a large proportion of research claims and results cannot be replicated either by the original researchers or by another team (Ioannidis, 2017[57]). Obstacles to reproducibility arise at all stages of the research process, including not controlling for bias at the design stage, p-hacking (generating hypotheses and making analytical decisions which fit the structure of the observed data), failing to properly outline the experimental conditions under which the results were obtained, and reporting results that are statistically significant but have small effect sizes (Munafò et al., 2017[58]).

A number of initiatives aim to improve the ability to replicate important research results and strengthen the knowledge base which is used to underpin many decision processes and inform further research. For example, in some fields such as medicine, pre-registration of studies and specification of their protocols in advance of conducting the research have become standardised (Munafò et al., 2017[58]) and many high-impact journals have introduced more stringent requirements for authors to describe the conditions under which experiments were carried out (McNutt, 2014[59]).

Other policy actions which can improve the reliability of research include open science movements such as the European Commission’s European Open Science Cloud, which has a goal of ensuring that all scientific publications are FAIR (Findable, Accessible, Interoperable and Reusable). One of the key drivers of the requirement for FAIRness is the recognised need for research to be more reproducible, and evidence suggesting that implementing FAIR principles systemically is likely to bring considerable return on investment in terms of research quality, transparency and discoverability (European Commission, 2018[60]). Governments can also play a role in improving research quality, for example by funding research which aims to replicate existing results and requiring pre-registration of study hypotheses as a condition for awarding funding (KNAW, 2018[61]).

6.7.2. Volume and impact of research output

Metrics used for assessing the performance of research in higher education at the systemic level include the volume of output, measured by the quantity of scholarly output per FTE researcher, and the impact of output, often measured by citation counts per FTE researcher. These values are often normalised by field of study, due to differences in citation levels between fields. Another measure used to assess the quality of research is the number of scholarly outputs per FTE researcher published in high-impact journals, i.e. journals whose publications traditionally attract more citations from the scientific community (Box 6.5).
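
A minimal sketch of how a field-normalised citation impact could be computed is given below. The field baselines and publication records are hypothetical placeholders; in practice the world averages come from an indexed database such as Scopus, broken down by field, year and document type.

```python
# Minimal sketch of a field-normalised citation impact.
# World-average baselines and publication records below are hypothetical placeholders.

world_avg_citations = {       # assumed world-average citations per document, by field
    "medicine": 12.0,
    "mathematics": 3.0,
    "social sciences": 5.0,
}

publications = [              # hypothetical output of one system: (field, citations received)
    ("medicine", 18),
    ("medicine", 6),
    ("mathematics", 4),
    ("social sciences", 5),
]

# Each document's citations are divided by the world average for its field, so a score of 1.0
# means "cited as often as the world average in that field"; the system-level indicator is the mean.
normalised_scores = [cites / world_avg_citations[field] for field, cites in publications]
normalised_impact = sum(normalised_scores) / len(normalised_scores)

print(round(normalised_impact, 2))   # -> 1.08 for these illustrative figures
```

Normalising in this way prevents fields with intrinsically high citation rates, such as medicine, from dominating a comparison with fields where citation counts are typically lower.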

Figure 6.24 presents some information on the overall quantity and impact of scientific production in different economies, by measuring the volume of scientific publications and the relative numbers of citations they attract.

In terms of volume of publications, the most productive countries in 2015 were Australia, Denmark and Switzerland, with around 5 publications per 1 000 25-64 year-olds. On the other hand, Chile, Mexico and Turkey had the lowest volume of publications, at less than one publication per 1 000 25-64 year-olds.

Norway and the Netherlands produced publications at a level higher than the OECD average in 2015, with around 4 publications per 1 000 25-64 year-olds, compared to the OECD average of 3 publications. In the same year, Belgium produced 3 publications and Estonia 2.5 publications for every 1 000 25-64 year-olds.

The percentage of documents from each country in the global 10% most-cited publications allows a comparison of the scientific impact of publications at the system level, as a proxy for the quality of output of research systems. In 2015, Switzerland had the largest share of domestic scientific documents within the top 10% most-cited publications (15%), closely followed by the Netherlands and Luxembourg. On the other hand, only about 4% of publications in Lithuania, Mexico and Turkey appeared among the world’s most-cited publications (Figure 6.24).
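
The indicators shown in Figure 6.24 can also be illustrated with a small sketch: fractional counting splits an internationally co-authored paper across the economies involved, the per-capita rate divides the resulting counts by the population aged 25-64, and the top-10% share counts the proportion of an economy's documents above a global citation threshold. All records, citation counts and population figures below are hypothetical, and real calculations are carried out separately by field, year and document type.

```python
# Illustrative sketch of the indicators in Figure 6.24, using hypothetical data.
from collections import defaultdict

papers = [
    # (economies of the authors' affiliations, citations received)
    (["NOR"], 40),
    (["NOR", "NLD"], 8),           # an international co-publication counts 0.5 for each economy
    (["NLD"], 25),
    (["NLD", "BEL", "EST"], 3),    # a three-way collaboration counts one-third for each economy
    (["EST"], 1),
    (["BEL"], 12),
]

population_25_64 = {"NOR": 2.8e6, "NLD": 9.0e6, "BEL": 6.0e6, "EST": 0.7e6}  # rough illustrative figures

# Fractional publication counts, then publications per 1 000 persons aged 25-64.
fractional = defaultdict(float)
for economies, _ in papers:
    for economy in economies:
        fractional[economy] += 1.0 / len(economies)
per_1000 = {e: count / population_25_64[e] * 1000 for e, count in fractional.items()}

# Citation threshold delimiting the global top 10%, then each economy's share of documents above it.
citations = sorted((c for _, c in papers), reverse=True)
threshold = citations[max(1, len(citations) // 10) - 1]
top10_share = {
    e: sum(1 for economies, c in papers if e in economies and c >= threshold)
       / sum(1 for economies, _ in papers if e in economies)
    for e in fractional
}

print(per_1000)
print(top10_share)
```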

Figure 6.24. Quantity and impact of scientific production (2015)
Number of documents and percentage among the world’s 10% most cited publications, fractional counts

Note: *Participating in the Benchmarking Higher Education System Performance exercise 2017/2018.

OECD calculations based on Scopus Custom Data, Elsevier, Version 4.2017; and 2015 SCImago Journal Rank from the Scopus journal title list (accessed June 2017), July 2017.

Source: Adapted from OECD (2017[13]), OECD Science, Technology and Industry Scoreboard 2017: The digital transformation, https://doi.org/10.1787/9789264268821-en.

 StatLink https://doi.org/10.1787/888933941671

Belgium also performs highly according to this measure, with around 13% of publications among the most cited globally, higher than the OECD average level of just under 10%. There are no disaggregated statistics for the regions of Belgium, but the normalised score for most-cited publications from the European Regional Innovation Scoreboard shows the highest performance for Flanders (0.77), followed by the Brussels Region (0.72) and Wallonia (0.69) (European Commission, 2017[62]). Norway (11%) and Estonia (10%) both have levels of top cited publications slightly higher than the OECD average, and Estonia in particular has shown a considerable improvement in this indicator from 2005 to 2015 (OECD, 2017[13]).

The number of top-cited publications has been used widely as a proxy measure of the quality of research output, though it may be more accurately considered as a measure of its impact, as certain papers such as broad reviews of literature tend to attract more citations regardless of quality, certain fields of study tend to have higher citation counts, and authors may also cite a paper when criticising it (Tahamtan, Safipour Afshar and Ahamdzadeh, 2016[63]). Despite some shortcomings in the measurement process, the use and acceptance of bibliometric data to measure performance is growing across the OECD. In many countries, such as the participating jurisdictions, they are now part of the decision-making process for R&D funding (Box 6.6).

Box 6.6. Connecting R&D funding to bibliometric data

To improve the quantity and quality of their scientific output, participating jurisdictions have incorporated bibliometric information into R&D funding decisions.

In Estonia, around one-third of base funding is based on the number of publications in internationally recognised journals, the number of high level research monographs and the number of registered patents and patent applications (Jonkers and Zacharewicz, 2016[18]). The remainder of the funding is based on qualitative evaluations.

In the Flemish Community, around 40% of the ‘Special Research Funds’ provided to Flemish universities are based on research output and scientific impact (Jonkers and Zacharewicz, 2016[18]). Among the bibliometric information considered when allocating funding are publications in the Web of Science (WoS), a repository of academic articles, and citations and publications in the Flemish Academic Database for the Social Sciences and Humanities (VVAB). The latter was created in response to the low representation of social sciences and humanities journals in the WoS (Jonkers and Zacharewicz, 2016[18]). Inspired by the Norwegian funding model for research, the Flemish Government modified the bibliometric part of the funding model in 2008 to give prominence to all areas of research and make field-specific publications comparable across fields. Publications in the VVAB were included in the funding model in 2010, and their relative weight has increased since 2012.

Norway introduced incentives for publications in the higher education funding model in 2004. The funding model for research was designed in a way that offers a complete representation of verifiable bibliographical records in all areas of research and makes field-specific output comparable across research fields (Sivertsen, 2016[64]). Comprehensive bibliometric information is verified or provided by research organisations, through an integrated national research information system (CRISTIN), covering all public research organisations in Norway, including universities, university colleges, university hospitals and independent research institutes. Higher weight is given to publications in the most selective international journals and book publishers. Evidence suggests that this has not led to higher citation impact at the country level, but it did increase the absolute number of publications in high-level publication channels (Sivertsen, 2016[64]).

The Netherlands uses a Standard Evaluation Protocol (SEP) to monitor the quality of research. The SEP is periodically evaluated by the association of universities, the Research Council and the Royal Academy of Arts and Sciences. The SEP planned for 2015-2021 has moved from a high emphasis on research output to research quality. All research universities and research institutes are subject to assessment according to the guidelines outlined in the SEP. In 2014, the Netherlands released a White Paper announcing its vision for science and research for 2025. It envisages conducting world-class research, maximising research impact through stronger links to industry and society, and developing talent (OECD, 2016[49]).

Figure 6.25. The citation impact of scientific production and the extent of international collaboration (2012-2016)
As an index and percentage of all citable documents, based on fractional counts

Note: *Participating in the Benchmarking Higher Education System Performance exercise 2017/2018.

The size of the bubble indicates the relative volume of publications (using fractional counts). The normalised citation impact measure is derived as the ratio between the average number of citations received by documents published by authors affiliated with an institution in a given economy and the world average of citations, over the same time period, by document type and subject area.

OECD calculations based on Scopus Custom Data, Elsevier, Version 4.2017, July 2017.

Source: Adapted from OECD (2017[13]), OECD Science, Technology and Industry Scoreboard 2017: The digital transformation, https://doi.org/10.1787/9789264268821-en.

 StatLink https://doi.org/10.1787/888933941690

Comparing the data in Figure 6.22 and Figure 6.23 on international mobility and collaboration of researchers with Figure 6.24 suggests a link between internationalisation and research performance. The countries that perform best in terms of the scientific quality of their research, as measured by field-normalised citation impact, tend to be those with higher levels of international collaboration.

Figure 6.25 also reinforces this point. Denmark, the Netherlands and Switzerland are among the top performers in OECD countries in terms of citation impact, with a normalised impact at least 30% higher than the OECD median for all indexed publications between 2012 and 2016. These countries were also among the OECD countries with relatively high levels of international collaboration between 2012 and 2016 (between 34% and 41% of all publications involved international collaboration). Belgium and Norway are also in the top right quadrant of Figure 6.25, indicating above average performance in both citation impact and international collaboration, while Estonia is near the median values for both measures.

Figure 6.26. Top 10% most-cited documents and patterns of international collaboration (2015)
Domestic and foreign-led top cited, as a percentage of all documents, fractional counts

Note: *Participating in the Benchmarking Higher Education System Performance exercise 2017/2018.

OECD calculations based on Scopus Custom Data, Elsevier, Version 4.2017; and 2015 SCImago Journal Rank from the Scopus journal title list (accessed June 2017), July 2017.

Source: Adapted from OECD (2017[13]), OECD Science, Technology and Industry Scoreboard 2017: The digital transformation, https://doi.org/10.1787/9789264268821-en.

 StatLink https://doi.org/10.1787/888933941709

The strength of the research performance of the Netherlands is further confirmed by the fact that it is second only to the United States in the percentage of top 10% most-cited documents led by a domestic author in 2015, either with or without international collaboration (Figure 6.26). Belgium had a similar percentage of top 10% most-cited documents led by a domestic author with international collaboration to the Netherlands (just over 3% in both countries), but had a smaller share of top cited publications with no international collaboration (8% compared to almost 10% in the Netherlands). Norway and Estonia had similar shares of most-cited documents led by a domestic author with and without international collaboration, both just above the OECD average levels.

Bilateral flows of researchers can help to further increase the impact of research. As discussed in Section 6.6, evidence suggests that authors who undertake research abroad and return to the economy (“returnees” in Figure 6.27) in which they first published contribute to raising the overall impact of domestic research. Authors who move abroad (“outflows”) tend to be associated with higher rated publications than their counterparts who remain in the country or return later. Authors who do not move abroad (“stayers”) are generally more likely to publish in lower ranked journals (OECD, 2017[13]).

The United States is somewhat exceptional in this regard; researchers who moved into the country (“new inflows”) had higher journal scores in 2016 than those who had stayed in the country throughout their career. However, United States-based authors who left the country and moved abroad had lower journal scores, as measured by the SCImago journal rank (Figure 6.27).

Figure 6.27. Expected citation impact of scientific authors, by mobility profile (2016)
Average 2015 SCImago Journal Rank (SJR) scores

Note: *Participating in the Benchmarking Higher Education System Performance exercise 2017/2018.

OECD calculations based on Scopus Custom Data, Elsevier, Version 4.2017; and 2015 SCImago Journal Rank from the Scopus journal title list (accessed June 2017), July 2017.

Source: Adapted from OECD (2017[13]), OECD Science, Technology and Industry Scoreboard 2017: The digital transformation, https://doi.org/10.1787/9789264268821-en.

 StatLink https://doi.org/10.1787/888933941728

In the Netherlands, there was almost no difference between returnee, outflow or new inflow authors in 2016 in terms of the ranking of the journals where they publish (as measured by the SCImago journal rank score), although stayers had a lower journal score. On the other hand, in Norway returnees tended to publish in lower-ranked journals than the other groups of authors. In Belgium new inflows were the group who were able to publish most frequently in higher-ranked journals in 2016. Estonia had the widest range of scores between groups, and the largest difference between the expected citation impact of returnees and stayers (although these effects may also be due to the statistical variability produced by the smaller size of the research community in the country).

6.7.3. Turning research into innovation

Innovations can come about in a number of different ways, including as a result of research and development activities. The results of research projects can lead to knowledge that generates new ideas or inventions, which when implemented or diffused across society can be converted into impactful innovations (OECD/Eurostat, 2018[65]). In experimental development, the primary intention is to develop innovative processes or products, though other research and development activities can also strengthen individual or organisational capacities for innovation, even where innovation is not the primary objective of the research (OECD/Eurostat, 2018[65]).

Figure 6.28. PCT published applications by sector (2010-2016)
Percentage by sector and individuals

Note: *Participating in the Benchmarking Higher Education System Performance exercise 2017/2018.

Data include all Patent Co-operation Treaty applications which were published between 2010 and 2016. WIPO uses published applications for confidentiality reasons. Government and PROs are not calculated separately; they are aggregated into the same group.

Source: World Intellectual Property Organization (2010-2016[66]), PCT Yearly Review: The International Patent System, http://www.wipo.int/pct/en/activity/index.html.

 StatLink https://doi.org/10.1787/888933941747

When an organisation or research team develops an innovative idea, the resulting intellectual property can be legally protected in various ways, including through patents and trademarks. Therefore, data on patent applications are often used as a proxy for analysing innovative output. Data in Figure 6.28 cover all Patent Co-operation Treaty (PCT) patent applications published between 2010 and 2016, by sector and by individuals. The vast majority of published applications originate in the business enterprise sector, followed by individuals; higher education, government and public research organisations generate smaller proportions of patents.

Patents can give an indication of how well expenditure on higher education research and development is turned into innovative output. On average across OECD countries, fewer than 8% of patents are filed by the higher education sector, but the figures vary. For example, higher education accounts for more than one-quarter of published applications in Chile and Portugal, where the share of researchers working in higher education is relatively high. On the other hand, the proportion of patents filed by the higher education sector is close to zero in Iceland and Sweden.

Figure 6.29. PCT published applications by higher education and government researchers (2010-2016)
Number per 100 researchers

Note: *Participating in the Benchmarking Higher Education System Performance exercise 2017/2018.

Data include all Patent Co-operation Treaty applications which were published between 2010 and 2016. WIPO uses published applications for confidentiality reasons. Government and PROs are not calculated separately; they are aggregated into the same group.

Source: World Intellectual Property Organization (2010-2016[66]), PCT Yearly Review: The International Patent System, http://www.wipo.int/pct/en/activity/index.html.

 StatLink https://doi.org/10.1787/888933941766

In addition to the variability across countries, there are significant differences between the government and higher education sectors. In general, the average number of published patent applications by government researchers in OECD countries is larger than the number of published patent applications by higher education researchers. A notable example is Switzerland, with over 14 published applications per 100 government researchers between 2010 and 2016, compared to 5 per 100 higher education researchers (Figure 6.29). This may be explained by the fact that almost all R&D undertaken by the government sector in Switzerland is dedicated to applied research (OECD, 2017[13]).

Korea and Israel have the highest numbers of patents per 100 researchers from the higher education sector. The high productivity of researchers in Korea may be related to the fact that the majority of expenditure on R&D in higher education goes into applied research and experimental development. However, other factors are also likely to play a role: Israel also has a relatively high number of patents per 100 researchers, yet only about one-third of its higher education R&D funding is spent on applied research and experimental development (Figure 6.6).

In the Netherlands, there were 3.5 published applications for patents per 100 higher education researchers between 2010 and 2016, above the OECD average. In Belgium, there were 2.4 published applications per 100 researchers, slightly below the average; while Norway and Estonia were further below the average at close to 1.7 patents per 100 researchers each. The number of published patent applications for the government sector is also relatively high in Belgium and the Netherlands (around 3 per 100 researchers). On the other hand, the government sector in Estonia and Norway publishes relatively few patents, which could be related to the missions of public research institutes in these jurisdictions. For example, in Estonia, government research institutes that have remained outside the higher education sector tend to have other functions in addition to conducting research.

Although Figure 6.29 indicates that the rate of patent applications from the higher education sector is relatively low overall, higher education research and development outputs may indirectly have a larger impact than the patent data suggest. For example, due to the legal situation in some countries, patents may be assigned to actors outside the higher education sector. Thus, the quantity of patent applications with higher education institutions as the origin but not the applicant remains largely unknown. In other cases, the higher education sector might create the knowledge which spurs patent applications. This influence is difficult to capture with existing metrics, although efforts have been made to identify relevant indicators, such as the number of patent applications filed by other sectors that cite academic papers (The EUMIDA Consortium, 2010[67]).

Research and development in higher education also contributes to innovation through a number of pathways other than patents. Through increased engagement-related activity, higher education institutions and systems aim to further enhance the social impact of research carried out in the higher education system. Chapter 7 explores some of the ways in which higher education systems have been seeking to improve collaboration and create a more favourable environment for innovative processes.

6.7.4. Fostering research excellence in higher education

As discussed in previous sections, the quality of research can be assessed by considering the impact of research output on the work of other researchers, or by examining how well research can be turned into innovative products, services and technologies. While the discussion in the previous sections focuses on systemic performance, in reality, the highest impact research is concentrated not only within certain countries, but in a subset of institutions within those countries. In terms of vertical differentiation, high impact research is often most associated with the more elite research universities, and high research performance is essential for universities to achieve the “world-class” status of being ranked among the top universities globally.

The initial publication of the Academic Ranking of World Universities (ARWU) by Shanghai Jiao Tong University in 2003, followed closely by the Times QS World University Ranking in 2004, led to an almost immediate general acceptance of these metrics throughout the global higher education sector and sparked waves of policy initiatives at institutional, national and supranational level aimed at increasing standing in the rankings (Hazelkorn, 2009[68]).

Concern has been expressed about the narrow range of metrics used in international institutional rankings, and about the methodology used to compute them. For example, reputation surveys are a key input (see Chapter 2), which can be subject to manipulation and various biases (Bowman and Bastedo, 2011[69]). Rankings of individual institutions are sensitive to changes in the indicators or weightings used, which limits their utility for students and policymakers and may result in sub-optimal choices if used as a basis for making decisions (Saisana and Saltelli, 2010[70]).

Despite concerns about the reliability of the rankings, the high weight they attach to research impact, whether through bibliometric indicators, the number of staff awarded international prizes (the Nobel Prize and the Fields Medal) for breakthrough research, or indirectly through research reputation, helps to explain the increasing investment in higher education research by institutions in recent years and a growing policy focus on research excellence.

In this competitive environment, research excellence initiatives have become commonplace across OECD countries and in other countries that are investing heavily in research output and quality, such as China and the Russian Federation. A 2013 OECD survey of government ministries, to which 20 countries responded, identified 28 funding initiatives from 18 of the countries that met the criteria to be considered a Research Excellence Initiative (OECD, 2014[71]).

Research excellence initiatives have been defined by the OECD as instruments that are designed to encourage outstanding research by providing large-scale, long-term funding to designated research units (often termed centres of excellence or CoEs). Many benefits of research excellence initiatives have been identified, including the enhanced ability of CoEs to attract and concentrate highly talented researchers in well-equipped environments, and the security they provide for carrying out broad and complex research agendas, especially for projects involving transdisciplinary research (OECD, 2014[71]).

In the participating jurisdictions, many research excellence initiatives have been implemented:

  • The development of excellent academic communities is one of three core pillars in the Norwegian Long-term Plan for Research and Higher Education. The Research Council of Norway’s Centres for Excellence and Centres for Research-based Innovation are key mechanisms through which Norway supports higher education research excellence. Through these programmes, large tranches of funding are awarded to research clusters on a competitive basis, based on selection criteria which focus on scientific quality and high international standards (OECD, 2017[72]).

  • The Flemish Community’s “VIS-scheme” (Flemish Cooperative Innovation Networks) has been responsible since 2001 for the creation of centres of excellence in the Flemish Community. Since 2009, many of these centres have been streamlined, consolidated or scaled up to become strategic research centres. More recently, the VIS-scheme has supported the development of Innovation Platforms, which provide a platform for the co-operation of various actors engaged in research in a particular industry. Many of the innovation initiatives are in the process of being updated following a new policy which focuses on strategic clustering of research actors (Flemish Department of Economy, Science and Innovation, 2017[6]).

  • The Netherlands promotes excellent research through the Gravitation Programme, which supports the formation of consortia of universities that have the potential to conduct ground-breaking scientific research of international importance, preferably leading to some breakthrough of global significance. The selection procedure is conducted by the Netherlands Organisation for Scientific Research (NWO) (Dutch Ministry of Education, Culture and Science, 2014[8]).

  • In Estonia, the programme of the Centres of Excellence in Research (CoE) was introduced in 2001. A Centre of Excellence in Estonia consists of one or more internationally high-level research teams that have a clear set of common research objectives and work under the same management, with the aim of strengthening the international competitiveness and the quality of research, improving performance, ensuring future generations of researchers, intensifying national and international research co-operation between institutions and increasing the international impact of Estonian research (Estonian Ministry of Education and Research, 2017[73]).

6.8. Concluding remarks

This chapter provided a discussion of the available metric data related to the inputs, processes, outputs and outcomes of higher education research and development, as well as a more in-depth analysis of relevant policies and practices in the four participating jurisdictions. In this section, key messages of this chapter are outlined, along with an overview of areas where additional data would provide benefits for assessing the performance of the research function in higher education.

  • The key justification for investment in research and development is that it underpins the creation of new knowledge that is needed to develop future innovations. With that in mind, OECD governments are aiming to increase the level of investment in research as a proportion of GDP, as well as broaden the range of sources for R&D investment. As discussed in Chapters 3 and 4, a clearer delineation between the resources (human and financial) invested in education and research would allow for a more robust analysis of the efficiency and cost-effectiveness of the research and development activities of higher education systems.

  • Ensuring access to a rewarding career in research is a core requirement for building and sustaining a high-performing research and development system. More comprehensive and reliable data on the different types of researchers within the higher education system and in the private sector, their socio-demographic characteristics and the different stages of their careers would provide a greater understanding of how government policy could support the needs of R&D systems for high-quality human resources, for example by identifying mismatches between field of study and sector of employment, understanding employment conditions in research-oriented occupations within and outside academia, and monitoring transition paths in and out of academia.

  • Bibliometric data are currently the only means of conducting comparative metric analysis of the quality and impact of research across countries. They are also the best available data source for inferring information about the flow of researchers between jurisdictions, and the effect that this has on research quality. However, there are a number of conceptual and methodological challenges associated with using bibliometric data. While there is no obvious alternative at present, it is likely, given the growth in research activity in recent years across the OECD, that there will be increasing interest in developing a broader and more reliable range of indicators to measure research impact.

  • In addition to the metric data presented in this chapter, a number of national policies and practices in the participating jurisdictions are motivated by improving various aspects of the research function in higher education. A summary of some of the initiatives presented in this chapter is given in Table 6.4.

Table 6.4. Selected higher education policies from the participating jurisdictions (2017)

 

Estonia
Motivation: Increasing the internationalisation of research
Policies:

  • The Dora Plus and Mobilitas Plus programmes have been established to attract students and researchers from abroad, improve Estonia’s reputation as a destination for research and expand transnational collaboration opportunities. Among other supports, the Dora programmes provide scholarships for international students for study visits to Estonia and support higher education institutions in Estonia in organising short-term courses for international study groups. Initiatives under Mobilitas Plus include post-doctoral research grants for researchers coming from abroad, and returning researcher grants for researchers returning to Estonia after completing research abroad.

  • Estonia also participates actively in many international research projects and initiatives, including the European Molecular Biology Conference (EMBC), European Space Agency (ESA), European Spallation Source (ESS) and the European Organization for Nuclear Research (CERN).

  • Estonia has relatively high Horizon 2020 funding as a percentage of GDP among the jurisdictions.

The Flemish Community
Motivation: Improving and streamlining investment in R&D
Policies:

  • The Flemish Community has brought investment in R&D to a level of 2.5% of GDP, with the target of reaching 3% by 2020.

  • Funding mechanisms include ‘Special Research Funds’ (BOF), which are awarded based on the number of master’s and doctoral degrees awarded, gender diversity, and research productivity and impact. Institutions can also benefit from ‘Industrial Research Funds’ (IOF) if they engage in technology transfer activities, such as licensing, patenting and spin-offs.

  • The Flemish Community is among the jurisdictions most successful at attracting funding from Horizon 2020.

The Netherlands
Motivation: Creating world-class, high-impact research
Policies:

  • The Gravitation Programme supports the formation of consortia of universities that have the potential to conduct ground-breaking scientific research of international importance, preferably leading to some breakthrough of global significance.

  • Standard evaluation protocols (SEP) are used to monitor the quality of research.

Norway
Motivation: Developing flexible ways to access a career in research
Policies:

  • State institutions and private institutions carry out doctoral research.

  • Researchers are treated as employees and receive social benefits.

  • Public sector organisations and businesses that allow their employees to complete a doctorate in their area of work are entitled to financial support from the Research Council of Norway.

  • Norway participates in international joint doctoral supervision projects (cotutelle).

Source: Adapted from information provided by the participating jurisdictions. See the reader's guide for further information.

References

[22] Altbach, P., L. Reisberg and L. Rumbley (2009), Trends in Global Higher Education: Tracking an Academic Revolution, UNESCO, Paris, https://unesdoc.unesco.org/ark:/48223/pf0000183168 (accessed on 30 August 2018).

[27] Ampaw, F. et al. (2012), “Completing the Three Stages of Doctoral Education: An Event History Analysis”, Research in Higher Education, Vol. 53, pp. 640-660, https://doi.org/10.1007/s11162-011-9250-3.

[37] Barnett, J. et al. (2017), “A comparison of best practices for doctoral training in Europe and North America”, FEBS openbio, pp. 1444-1452, https://doi.org/10.1002/2211-5463.12305.

[23] Borgonovi, F. et al. (2018), Empowering Women in the Digital Age; Where Do We Stand?, OECD, Paris, https://www.oecd.org/social/empowering-women-in-the-digital-age-brochure.pdf (accessed on 30 August 2018).

[56] Bornmann, L. (2013), “What Is Societal Impact of Research and How Can It Be Assessed? A Literature Survey”, Journal of the American Society for Information Science and Technology, Vol. 64/2, pp. 217-233, https://doi.org/10.1002/asi.22803.

[69] Bowman, N. and M. Bastedo (2011), “Anchoring Effects in World University Rankings: Exploring Biases in Reputation Scores”, Higher Education, Vol. 61, pp. 431-444, https://doi.org/10.1007/s10734-010-9339-1.

[41] Castelló, M. et al. (2017), “Why Do Students Consider Dropping Out of Doctoral Degrees? Institutional and Personal Factors”, Higher Education, Vol. 74/6, pp. 1053-1068, https://doi.org/10.1007/s10734-016-0106-9.

[26] Chiteng Kot, F. and D. Hendel (2012), “Emergence and Growth of Professional Doctorates in the United States, United Kingdom, Canada and Australia: A Comparative Analysis”, Studies in Higher Education, pp. 345-364, https://doi.org/10.1080/03075079.2010.516356.

[9] Dutch Ministry of Education, Culture and Science (2019), Curious and Committed: The Value of Science, Dutch Ministry of Education, Culture and Science, Den Haag, https://www.government.nl/documents/policy-notes/2019/01/28/curious-and-committed---the-value-of-science.

[10] Dutch Ministry of Education, Culture and Science (2015), The Value of Knowledge: Strategic Agenda for Higher Education and Research 2015-2025, Dutch Ministry of Education, Culture and Science, Den Haag, https://www.government.nl/documents/reports/2015/07/01/the-value-of-knowledge (accessed on 22 August 2018).

[8] Dutch Ministry of Education, Culture and Science (2014), 2025 - Vision for Science Choices for the Future, Dutch Ministry of Education, Culture and Science, Den Haag, https://www.government.nl/topics/science/documents/reports/2014/12/08/2025-vision-for-science-choices-for-the-future (accessed on 7 August 2018).

[73] Estonian Ministry of Education and Research (2017), Base funding and centres of excellence, Estonian Ministry of Education and Research, Tallinn, https://www.hm.ee/en/activities/research-and-development/base-funding-and-centres-excellence (accessed on 26 November 2018).

[12] Estonian Ministry of Education and Research (2014), Knowledge-based Estonia: Estonian Research and Development and Innovation Strategy 2014-2020, Estonian Ministry of Education and Research, Tallinn, https://www.hm.ee/sites/default/files/estonian_rdi_strategy_2014-2020.pdf (accessed on 22 August 2018).

[19] European Commission (2018), Horizon 2020 in Full Swing - Three Years On - Key Facts and Figures 2014-2016, Publications Office of the European Union, Luxembourg, https://dx.doi.org/10.2777/778848.

[60] European Commission (2018), Turning FAIR into reality, Publications Office of the European Union, Luxembourg, https://doi.org/10.2777/1524.

[62] European Commission (2017), Regional Innovation Scoreboard 2017, Publications Office of the European Union, Luxembourg, https://doi.org/10.2873/593800.

[20] European Strategy Forum on Research Infrastructures Long-Term Sustainability Working Group (2017), Long-Term Sustainability of Research Infrastructures, Dipartimento di Fisica - Università degli Studi di Milano, Milano, https://ec.europa.eu/research/infrastructures/pdf/esfri/publications/esfri_scripta_vol2.pdf (accessed on 27 August 2018).

[28] Eurydice (2018), National Education Systems, Eurydice, Brussels, https://eacea.ec.europa.eu/national-policies/eurydice/home_en (accessed on 15 December 2018).

[29] Eurydice (2017), Modernisation of Higher Education in Europe: Academic Staff - 2017, Publications Office of the European Union, Luxembourg, https://doi.org/10.2797/408169.

[34] Eurydice (2016), Estonia: Higher Education, https://webgate.ec.europa.eu/fpfis/mwikis/eurydice/index.php/Estonia:Higher_Education (accessed on 16 March 2018).

[35] Eurydice (2014), Netherlands: Higher Education, https://webgate.ec.europa.eu/fpfis/mwikis/eurydice/index.php/Netherlands:Higher_Education (accessed on 16 March 2018).

[36] Eurydice (2011), Norway: Higher Education, https://webgate.ec.europa.eu/fpfis/mwikis/eurydice/index.php/Norway:Higher_Education (accessed on 16 March 2018).

[6] Flemish Department of Economy, Science and Innovation (2017), STI in Flanders: Science, Technology and Innovation. Policy and Key Figures - 2017, Flemish Department of Economy, Science and Innovation, Brussels, http://www.vlaanderen.be/nl/publicaties/detail/sti-in-flanders-science-technology-amp-innovation-policy-amp-key-figures-2017 (accessed on 22 August 2018).

[11] Flemish Government (2014), Beleidsnota 2014-2019 - Werk, Economie, Wetenschap en Innovatie [Policy Paper 2014-2019 - Work, Economy, Science and Innovation], Flemish Government, Brussels, https://www.vlaanderen.be/nl/publicaties/detail/beleidsnota-2014-2019-werk-economie-wetenschap-en-innovatie (accessed on 22 August 2018).

[54] Garfield, E. (2003), “The Meaning of the Impact Factor”, International Journal of Clinical and Health Psychology, Vol. 3/2, pp. 363-369, http://www.garfield.library.upenn.edu/essays/v13p185y1990.pdf (accessed on 25 November 2018).

[14] Georghiou, L. (2015), Value of Research: Policy Paper by the Research, Innovation, and Science Policy Experts (RISE), European Commission, Brussels, https://ec.europa.eu/research/openvision/pdf/rise/georghiou-value_research.pdf (accessed on 27 June 2018).

[53] Glänzel, W. and H. Moed (2002), “Journal Impact Measures in Bibliometric Research”, Scientometrics, Vol. 53/2, pp. 171-193, https://doi.org/10.1023/A:1014848323806.

[68] Hazelkorn, E. (2009), “Rankings and the Battle for World-Class Excellence: Institutional Strategies and Policy Choices”, Higher Education Management and Policy, Vol. 21, https://doi.org/10.1787/hemp-v21-art4-en (accessed on 30 August 2018).

[52] Hirsch, J. (2005), “An Index to Quantify an Individual’s Scientific Research Output.”, Proceedings of the National Academy of Sciences of the United States of America, Vol. 102/46, pp. 16569-16572, https://doi.org/10.1073/pnas.0507655102.

[57] Ioannidis, J. (2017), “Acknowledging and Overcoming Nonreproducibility in Basic and Preclinical Research”, JAMA, Vol. 317/10, https://doi.org/10.1001/jama.2017.0549.

[42] Jairam, D. and D. Kahl (2012), “Navigating the Doctoral Experience: The Role of Social Support in Successful Degree Completion”, International Journal of Doctoral Studies, Vol. 7, pp. 311-329, https://pdfs.semanticscholar.org/8e86/769ea8f1df1235e9ee493296f9d24882b312.pdf (accessed on 25 November 2018).

[18] Jonkers, K. and T. Zacharewicz (2016), Research Performance Based Funding Systems: a Comparative Assessment, Publications Office of the European Union, Luxembourg, https://doi.org/10.2760/70120.

[4] Kattel, R. and B. Stamenov (2017), RIO Country Report 2016: Estonia, Publications Office of the European Union, Luxembourg, https://doi.org/10.2760/259046.

[61] KNAW (2018), Replication Studies - Improving Reproducibility in the Empirical Sciences, Royal Netherlands Academy of Arts and Sciences (KNAW), Amsterdam, https://knaw.nl/en/news/publications/replication-studies (accessed on 26 November 2018).

[40] Litalien, D. and F. Guay (2015), “Dropout Intentions in PhD Studies: A Comprehensive Model Based on Interpersonal Relationships and Motivational Resources”, Contemporary Educational Psychology, Vol. 41, pp. 218-231, https://doi.org/10.1016/j.cedpsych.2015.03.004.

[59] McNutt, M. (2014), “Reproducibility”, Science, Vol. 343, p. 229, https://doi.org/10.1126/science.1250475.

[44] Meneghini, R. and A. Packer (2007), “Is There Science Beyond English? Initiatives to Increase the Quality and Visibility of non-English Publications Might Help to Break Down Language Barriers in Scientific Communication”, EMBO reports, Vol. 8/2, pp. 112-116, https://doi.org/10.1038/sj.embor.7400906.

[58] Munafò, M. et al. (2017), “A Manifesto for Reproducible Science”, Nature Human Behaviour, Vol. 1, https://doi.org/10.1038/s41562-016-0021.

[7] Norwegian Ministry of Education and Research (2018), Meld. St. 4. Melding til Stortinget: Langtidsplan for Forskning og Høyere Utdanning 2019–2028 [Report to the Storting (white paper): Long-term Plan for Research and Higher Education 2019–2028], Norwegian Ministry of Education and Research, Oslo, https://www.regjeringen.no/no/dokumenter/meld.-st.-4-20182019/id2614131/ (accessed on 7 August 2018).

[31] OECD (2018), Education at a Glance 2018: OECD Indicators, OECD Publishing, Paris, https://doi.org/10.1787/eag-2018-en.

[32] OECD (2018), OECD Education Statistics, OECD Publishing, Paris, https://doi.org/10.1787/edu-data-en (accessed on 20 December 2018).

[16] OECD (2018), OECD Science, Technology and R&D Statistics, https://doi.org/10.1787/strd-data-en (accessed on 20 December 2018).

[38] OECD (2017), Education at a Glance 2017: OECD Indicators, OECD Publishing, Paris, https://dx.doi.org/10.1787/eag-2017-en.

[72] OECD (2017), Norway 2017, OECD Reviews of Innovation Policy, OECD Publishing, Paris, https://doi.org/10.1787/9789264277960-en.

[13] OECD (2017), OECD Science, Technology and Industry Scoreboard 2017: The Digital Transformation, OECD Publishing, Paris, https://doi.org/10.1787/9789264268821-en.

[25] OECD (2017), OECD Skills Outlook 2017: Skills and Global Value Chains, OECD Publishing, Paris, https://doi.org/10.1787/9789264273351-en.

[50] OECD (2017), The Knowledge Triangle: Enhancing the Contributions of Higher Education and Research Institutions to Innovation, OECD, Paris, https://www.oecd.org/innovation/knowledge-triangle.htm (accessed on 5 May 2018).

[48] OECD (2016), “Belgium”, in OECD Science, Technology and Innovation Outlook 2016, OECD Publishing, Paris, https://doi.org/10.1787/sti_in_outlook-2016-48-en.

[33] OECD (2016), Education at a Glance 2016: OECD Indicators, OECD Publishing, Paris, https://dx.doi.org/10.1787/eag-2016-en.

[46] OECD (2016), Enhancing Research Performance through Evaluation, Impact Assessment and Priority Setting, OECD, Paris, https://www.oecd.org/sti/inno/Enhancing-Public-Research-Performance.pdf (accessed on 29 August 2018).

[47] OECD (2016), “Estonia”, in OECD Science, Technology and Innovation Outlook 2016, OECD Publishing, Paris, https://doi.org/10.1787/sti_in_outlook-2016-58-en.

[49] OECD (2016), “Netherlands”, in OECD Science, Technology and Innovation Outlook 2016, OECD Publishing, Paris, https://doi.org/10.1787/sti_in_outlook-2016-77-en.

[45] OECD (2016), “Norway”, in OECD Science, Technology and Innovation Outlook 2016, OECD Publishing, Paris, https://doi.org/10.1787/sti_in_outlook-2016-79-en.

[3] OECD (2016), OECD Science, Technology and Innovation Outlook 2016, OECD Publishing, Paris, https://dx.doi.org/10.1787/sti_in_outlook-2016-en.

[1] OECD (2015), Frascati Manual 2015: Guidelines for Collecting and Reporting Data on Research and Experimental Development, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264239012-en.

[17] OECD (2015), OECD Science, Technology and Industry Scoreboard 2015: Innovation for Growth and Society, OECD Publishing, Paris, https://doi.org/10.1787/sti_scoreboard-2015-en.

[2] OECD (2015), The Innovation Imperative: Contributing to Productivity, Growth and Well-Being, OECD Publishing, Paris, https://doi.org/10.1787/9789264239814-en.

[5] OECD (2014), OECD Reviews of Innovation Policy: Netherlands 2014, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264213159-en.

[15] OECD (2014), OECD Science, Technology and Industry Outlook 2014, OECD Publishing, Paris, https://doi.org/10.1787/sti_outlook-2014-en (accessed on 22 August 2018).

[71] OECD (2014), Promoting Research Excellence: New Approaches to Funding, OECD Publishing, Paris, https://doi.org/10.1787/9789264207462-en.

[43] OECD (2013), Mapping Careers and Mobility of Doctorate Holders: Draft Guidelines, Model Questionnaire and Indicators - Third Edition, OECD Publishing, Paris, https://doi.org/10.1787/18151965.

[21] OECD (2008), Tertiary Education for the Knowledge Society: Volume 1 and Volume 2, OECD Publishing, Paris, https://doi.org/10.1787/9789264046535-en.

[55] OECD and SCImago Research Group (2016), Compendium of Bibliometric Science Indicators, OECD, Paris, http://oe.cd/scientometrics (accessed on 29 August 2018).

[65] OECD/Eurostat (2018), Oslo Manual 2018: Guidelines for Collecting, Reporting and Using Data on Innovation, 4th Edition, The Measurement of Scientific, Technological and Innovation Activities, OECD Publishing, Paris/Eurostat, Luxembourg, https://dx.doi.org/10.1787/9789264304604-en.

[24] OECD/Eurostat/UNESCO Institute for Statistics (2015), ISCED 2011 Operational Manual: Guidelines for Classifying National Education Programmes and Related Qualifications, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264228368-en.

[30] Research Council of Norway (2019), Funding from the Research Council, https://www.forskningsradet.no/en/apply-for-funding/funding-from-the-research-council/ (accessed on 29 April 2019).

[70] Saisana, M. and A. Saltelli (2010), “Rickety Numbers: Volatility of University Rankings and Policy Implications”, Research Policy, Vol. 40, pp. 165-177, https://doi.org/10.1016/j.respol.2010.09.003.

[64] Sivertsen, G. (2016), Publication-Based Funding: The Norwegian Model, Springer International Publishing, Cham, https://doi.org/10.1007/978-3-319-29016-4_7.

[63] Tahamtan, I., A. Safipour Afshar and K. Ahamdzadeh (2016), “Factors affecting number of citations: a comprehensive review of the literature”, Scientometrics, Vol. 107/3, pp. 1195-1225, https://doi.org/10.1007/s11192-016-1889-2.

[67] The EUMIDA Consortium (2010), Feasibility Study for Creating a European University Data Collection, European Commission, Brussels, https://ec.europa.eu/research/era/docs/en/eumida-final-report.pdf.

[39] Van Der Haert, M. et al. (2013), “Are Dropout and Degree Completion in Doctoral Study Significantly Dependent on Type of Financial Support and Field of Research?”, Studies in Higher Education, Vol. 39, pp. 1885-1909, https://doi.org/10.1080/03075079.2013.806458.

[51] Wilsdon, J. et al. (2015), The Metric Tide: Report of the Independent Review of the Role of Metrics, HEFCE, Bristol, https://doi.org/10.13140/RG.2.1.4929.1363.

[66] World Intellectual Property Organization (2010-2016), PCT Yearly Review: The International Patent System, World Intellectual Property Organization, http://www.wipo.int/pct/en/activity/index.html.

Notes

← 1. In some countries, there is no material difference between the policies or funding systems in the higher education and government sectors. For example, in Estonia, the same rules of funding apply for government, higher education and private non-profit sectors, independent of their legal status.

← 2. It should be noted that these data cover all sectors of R&D and are not specifically tailored to higher education. However, as researchers in higher education have the most incentive to publish their work in indexed publications, it could be expected that the measures are at least of this magnitude in higher education.

← 3. As indicated by the SCImago Journal Rank, a measure of scientific influence of scholarly journals that accounts for both the number of citations received by a journal and the importance or prestige of the journals where the citations are made (OECD, 2017[13]).
