Chapter 8. New trends in public research funding

Philippe Larrue
Dominique Guellec
Frédéric Sgard

The statistical data for Israel are supplied by and under the responsibility of the relevant Israeli authorities. The use of such data by the OECD is without prejudice to the status of the Golan Heights, East Jerusalem and Israeli settlements in the West Bank under the terms of international law.

What types of funding mechanisms and instruments should finance what type of research and for what effects? Despite progress in understanding the underlying dynamics, research funding is still the subject of lively discussions in academic and policy arenas.1

The various positions in these debates, often revolving around the two models of competitive and non-competitive funding, are rooted in different conceptual views of how new knowledge is generated and used in the innovation process. They also reflect the vested interests of various communities, since the answers given to this question influence the allocation of funds to different actors. Finally, they are strongly related to the national institutional set-ups in which the funding systems are embedded, adding a further layer of complexity to the debate.

These policy debates have become more intricate as the boundaries between the two formerly well-established modes of research funding – competitive and non-competitive – have become increasingly blurred and porous. On the one hand, competitive funding can be allocated to certain institutions – particularly centres of excellence – for a period of several years; on the other hand, institutional funding increasingly integrates performance-based components, introducing a degree of competition into these funding mechanisms.

Reflecting changes in the policy arena, an extensive academic and grey literature has progressively moved away from the usual dichotomy between competitive and non-competitive funding instruments, introducing more nuanced measurement and comparison of national funding patterns. Scholars and experts also scrutinise the operational/technical aspects of the different funding instruments (e.g. the components of the funding formula for institutional funding, and the criteria and selection modes for competitive funding). This body of work now offers a richer understanding of the funding landscape, more closely related to the reality experienced by policymakers.

However, little is yet known about the effects of funding instruments. What are the merits of the various instruments (and their multiple design variants) in achieving certain policy objectives, including supporting research excellence, steering research in certain directions or triggering breakthroughs? Although they do not provide systematic responses to this question, various country reviews, evaluations of schemes and programmes supporting research, and research works provide some useful insights on this matter. Together, they help shed light on the “purpose fit” of instruments, i.e. how certain instruments are more or less adapted to specific policy objectives. They also provide a significant – though scattered – evidence base on the various factors influencing the desired effects at the different stages of the funding process, from high-level strategic orientation to research implementation in higher education institutions (HEIs) and public research institutes (PRIs).

Connecting the technical (“how to fund?”) and political (“for what desired effects?”) aspects of research funding is essential to help policymakers design and use funding instruments in a way that best corresponds to their objectives. This chapter builds on recent progress in the academic and empirical literature, analysing the policy objectives and desired effects underlying the different types of government research funding. The OECD has recently resumed work in this field (OECD, forthcoming a), and further OECD work on research funding is planned for the 2019-20 biennium.

The chapter takes stock of recent changes in the allocation modes of research funding. It examines the increasingly complex set of funding instruments designed to convey a widening set of policy objectives, and proposes a simple analytical framework of the mix of these funding instruments as a continuum. Regarding the purpose fit of funding instruments, the chapter pays particular attention to performance-based institutional funding instruments, which have undergone recent reforms in many countries and offer new policy levers to accommodate a wide set of policy objectives. It concludes with a forward-looking view, drawing implications for future analytical work and discussing how emerging long-term trends (e.g. digitalisation and societal challenges) might influence the volume and types of research funding.

Innovation, particularly at the knowledge frontier and in emerging sectors, depends heavily on scientific progress (OECD, 2015a). HEIs and PRIs – which in 2016 accounted for just under 18% and 11%, respectively, of gross domestic expenditure on research and development (GERD) in OECD member countries, far below the business sector (69%) – perform more than three-quarters of total basic research.

HEIs play a growing role in research and development (R&D), surpassing PRIs, whose importance has decreased in many countries. In addition to providing higher education, universities are strongly engaged in the production of longer-term and higher-risk scientific knowledge, and increasingly in applied research, knowledge transfer and innovation activities.

Despite considerable country differences, government sources finance the bulk of academic research activities: in 2015, public funds supported 67% of academic research by HEIs and 92% of research by PRIs (OECD, 2017a). Budgetary restrictions in the aftermath of the 2008 global financial crisis negatively affected R&D funding (Box 8.1). However, research will remain an important component of public budgets, as the level of knowledge embedded in products and services keeps increasing, and the number of global challenges calling for radically new technological and social innovation also keeps rising.

Research funding is allocated in very diverse ways, reflecting the institutional settings of national research systems. The earliest and simplest typology distinguishes between competitive and non-competitive funding mechanisms:

  • Competitive project funding encompasses the programmes or instruments of funding agencies, research councils or ministries that allocate resources for a research activity limited in scope, budget and time, based on formal contests or competitions to which applicants apply. Financial awards can vary in size and length, and may be allocated to individuals, projects or centres (OECD, forthcoming a).

  • Non-competitive institutional funding includes institutional core or block funding, i.e. the general funding of research-performing institutions, without direct selection of R&D projects or programmes. It is generally allocated as a yearly government contribution to HEIs or PRIs (not to a specific sub-component or research group) to fund their day-to-day operations, such as staff salaries, infrastructure and maintenance related to education or research activities. While institutional funding was earmarked in the past for specific activities, it is now mostly allocated as a lump sum (block grant) that research institutions can spend as they see fit (OECD, 2015b; Jongbloed and Lepori, 2015).

Over the last two decades, the changing ways in which most governments allocate research funding have increasingly blurred the formerly well-established boundaries between these two major funding mechanisms. First, the gradual spread of new public management (NPM) thinking in many public administrations (reaching HEIs and PRIs in the 1980s and 1990s), together with growing pressure on budgets, led public authorities to increase the share of research funds distributed through competitive project funding (Hicks, 2010). Furthermore, NPM reforms not only expanded project-based funding; they also introduced performance-based variables and new conditions into the institutional funding allocated to HEIs and PRIs (Lepori, Geuna and Mira, 2007). In some countries (e.g. Sweden) and institutions (e.g. PRIs in Norway), attempts have been made to include strategic components in institutional funding, in order to better align research activities with national priorities while preserving institutional autonomy. As a result of these changes, institutional funding (which often still retains a strong historical component) can no longer be considered non-competitive and non-oriented.

An even more recent trend has also challenged the previously binary typology of funding mechanisms. Governments increasingly use competitions to allocate multi-year funding to institutions (or part of them) through different types of research excellence initiatives (REIs). These initiatives aim to encourage outstanding research by allocating large-scale, long-term funding directly to designated research units; hence, they feature elements of both institutional and project funding. In 2014, over two-thirds of OECD countries were operating such schemes, mostly established within the past decade (OECD, 2014a). The 2017 edition of the European Commission EC-OECD science, technology and innovation policy (STIP) survey showed similar results: 31 countries (i.e. 61% of a total of 51 countries)2 reported 84 initiatives using these funding instruments (EC/OECD, 2017).

The evolution of the funding landscape has challenged the boundaries between competitive and non-competitive funding instruments, requiring “nuanced” conceptual frameworks. Several initiatives – mainly commissioned by the European Commission and the OECD since the early 2000s – have attempted to clarify the definitions of instruments in this moving landscape and reflect these observations in precise statistics (Box 8.2).

Considering these changes and looking more closely at the diverse variants of funding instruments requires reconsidering the dichotomy between non-competitive and competitive funding as a continuum (Dialogic and Empirica, 2014). Based on progress made over the last decade in understanding research funding, this chapter proposes a simple analytical framework to present the portfolio of research-funding instruments available to policy makers along multiple and continuous – rather than unique and binary – dimensions.

These dimensions, as well as the main parameters influencing the positioning of the different funding instruments along them, are discussed below (a stylised sketch illustrating this framework follows the list):

  • Competition intensity: competition is more intense when the number of applicants is large relative to the total available budget. Since funders themselves often have little margin to increase the overall budget dedicated to a given funding stream, the size of the targeted population is the main lever at their disposal to manage competition intensity. Hence, the scope of calls for proposals in project funding and the eligibility rules for institutional funding (e.g. targeting only research universities), together with factors affecting the selection rate and the concentration of the distributed funds, are key determinants of competition intensity.

  • Granularity: the selection/allocation unit can be an entire institution, part of an institution (e.g. a faculty), or a project or programme of varying size and scope. This has important implications in terms of the scope and flexibility of the allocation, its stability, the level of fragmentation of the funding, etc.

  • Level: competition can also involve different levels within a single organisation, depending on the elementary units of allocation and assessment. These two units may not coincide, e.g. in the case of institutional funding, where the assessment is performed at the level of departments or research groups, with funding allocated to the organisation as a whole. Depending on internal allocation rules, competition between institutions can translate internally into rivalry between and within parts of these institutions.

  • Type of assessment and selection criteria: competition can be based on a wide array of criteria, using different timeframes for assessment. Selection/allocation criteria range from publications and citations, to third-party funding and expected social impact. At its simplest, a distinction can be made between input-related and output-related performance criteria. These criteria can be considered within timeframes with different durations (number of years) and directions (ex ante and/or ex post).

  • Orientation/directionality: funding allocation can be open, or targeted towards priority areas or issues (e.g. scientific disciplines, economic or societal problems). The more granular and ex ante the allocation, the easier it is for policymakers to steer funding in selected directions.
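
To make this framework more concrete, the sketch below positions a few stylised instruments along three of these dimensions (competition intensity, granularity and directionality). All instrument names, scores and figures are hypothetical illustrations rather than OECD data or definitions; the only computed quantity is a simple competition-intensity proxy derived from applications and awards.

```python
from dataclasses import dataclass

@dataclass
class FundingInstrument:
    """A stylised funding instrument positioned along continuous dimensions (values are hypothetical)."""
    name: str
    applications: int      # proxy for the size of the targeted population
    awards: int            # number of allocations actually funded
    granularity: float     # 0 = whole institution, 1 = individual project
    directionality: float  # 0 = fully open (bottom-up), 1 = tightly targeted on priorities

    @property
    def competition_intensity(self) -> float:
        """Higher when many applicants compete for few awards (1 minus the success rate)."""
        return 1 - self.awards / self.applications

# Hypothetical examples spanning the continuum, not actual national schemes
instruments = [
    FundingInstrument("Historical block grant", applications=50, awards=50,
                      granularity=0.0, directionality=0.1),
    FundingInstrument("Research excellence initiative", applications=60, awards=10,
                      granularity=0.3, directionality=0.4),
    FundingInstrument("Thematic project call", applications=400, awards=60,
                      granularity=1.0, directionality=0.9),
]

for inst in instruments:
    print(f"{inst.name:32s} competition={inst.competition_intensity:.2f}  "
          f"granularity={inst.granularity:.1f}  directionality={inst.directionality:.1f}")
```

In practice, scores for granularity and directionality would have to be derived from the design features discussed above (eligibility rules, unit of allocation, thematic targeting of calls), rather than assigned directly as in this sketch.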

Figure 8.3 schematises the mix of funding instruments as a continuum along three of the above dimensions. Although not all countries have implemented the full range of instruments, many of these instruments overlap or are layered on top of one another. For instance, performance-based funding is almost always provided on top of historical block funding, to allow some stability in funding allocation over time. Similarly, performance contracts are most often coupled with an (ex post) performance-based component. Therefore, the relative weights of the different funding components (e.g. the research performance-based component in Norway only affects 15% of the total block funding), and their possible synergistic effects, are important variables when defining a national funding portfolio.
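
To illustrate how the relative weight of a performance-based component shapes allocations, the sketch below computes a stylised block grant in which 15% of the total budget (echoing the Norwegian example above) is redistributed according to weighted indicator shares, while the remainder follows historical shares. The indicator names, weights and institutional data are hypothetical assumptions for illustration, not an actual national funding formula.

```python
# Stylised performance-based institutional funding formula (all figures hypothetical).
TOTAL_BUDGET = 1_000.0          # total block funding available (millions)
PERFORMANCE_SHARE = 0.15        # share redistributed on performance; the rest follows historical shares

# Hypothetical indicator weights within the performance-based component
WEIGHTS = {"publication_points": 0.5, "external_funding": 0.3, "doctoral_degrees": 0.2}

institutions = {
    "University A": {"historical_share": 0.55, "publication_points": 1200,
                     "external_funding": 80.0, "doctoral_degrees": 150},
    "University B": {"historical_share": 0.45, "publication_points": 900,
                     "external_funding": 110.0, "doctoral_degrees": 90},
}

# An institution's share of the performance pot is its weighted share of each indicator.
totals = {ind: sum(data[ind] for data in institutions.values()) for ind in WEIGHTS}

for name, data in institutions.items():
    historical = (1 - PERFORMANCE_SHARE) * TOTAL_BUDGET * data["historical_share"]
    performance = PERFORMANCE_SHARE * TOTAL_BUDGET * sum(
        weight * data[ind] / totals[ind] for ind, weight in WEIGHTS.items())
    print(f"{name}: historical={historical:.1f}m, performance={performance:.1f}m, "
          f"total={historical + performance:.1f}m")
```

Varying the performance share or the indicator weights in such a formula is precisely the kind of design lever discussed in the remainder of this chapter.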

What are the different funding instruments, with their multiple design variants, “good at”? As previously shown (Box 8.2), considerable conceptual, data-collection and case-study work has generated important progress in characterising and measuring research-funding trends over the last two decades. However, knowledge and evidence on the effects of research-funding mechanisms is much scarcer. A key preliminary step in assessing the effects of funding instruments consists in analysing their purpose fit, i.e. determining what policy instruments fit what objectives. This also reconnects the knowledge gained on instruments with the challenges facing policymakers as they attempt to respond to the mounting societal expectations of public research, far beyond a sole focus on scientific excellence.

Each instrument conveys an ever-widening range of policy objectives as new social needs arise, with more programmes stating multiple goals. A recent OECD project identified the desired effects most frequently stated in a dedicated questionnaire covering 75 competitive funding programmes from 21 countries (OECD, forthcoming a). The study distinguished between two sets of “internal” and “external” desired effects (Figure 8.4). Although they were not covered in the study, a similar array of objectives would probably apply to institutional funding instruments, albeit in different proportions.

This trend toward more programmes stating multiple goals results in more complex policy-instrument designs to accommodate these various objectives (Jongbloed and Lepori, 2015). For instance, new “mixed” or “hybrid” funding instruments have been introduced, either by adding competition and performance requirements to formerly “fixed” instruments, or by adding more strategic and longer-term components to competitive schemes (e.g. REIs).

The increasingly complex design of instruments also offers many levers to make them more “amenable” to fulfilling different policy objectives. Table 8.1 describes how the design features of three main “families” of instruments can be fine-tuned to accommodate these objectives. It focuses on the three most frequent and comprehensive types of desired effects: enhancing research excellence; steering research towards specific priorities; and creating the conditions for breakthroughs.

The section below briefly reviews the main elements of institutional funding, project-based funding and funding of REIs against these three types of desired effects.

  • Institutional funding focuses on maintaining a stable research infrastructure and underpinning longer-term “excellent” research. As “formal” selection is generally absent from this type of allocation, and academic institutions are entitled to use the funding as they see fit (serving the principle of academic freedom), it is generally not considered amenable to steering research towards specific national priorities. However, some of the initiatives presented in Box 8.3 show that institutional funding can create the appropriate conditions and incentives for researchers to engage in targeted research, provided the necessary strategic capabilities are present at the top level of the beneficiary institutions. They illustrate three main ways to steer research activities through institutional funding: “top-slicing” block grants to target specific priorities; providing additional earmarked institutional funding (either through direct negotiation or competitive awards) for large multi-year projects aligned with national priorities; and using performance contracts to help research institutions build up their profile in fields of national interest. If these initiatives are designed appropriately, and specific conditions are in place to promote co-operation between institutions (as with the Swedish Strategic research areas [SFO] programme), they could also serve the objective of creating the conditions for breakthrough research.

  • Project funding consists of allocating funds to groups or individuals to perform specific R&D activities, mostly based on a project proposal subjected to a competitive process. Project funding is considered a better policy tool than institutional funding to steer research, particularly with a view to producing higher-quality research and (to a lesser extent) research that is more relevant to socio-economic objectives (Hicks, 2010). By contrast, many studies have highlighted that an increasing reliance on competitive funding can result in shorter-term, lower-risk projects, rather than longer-term, higher-risk research, although the evidence for this is mixed.5 Moreover, the resource and time burdens of applying for and reviewing competitive grants can deter some of the best researchers from participating. Finally, project funding hinders the ability of researchers and institutions to engage in long-term planning, because of uncertain future funding. This is especially true for project-based funding with low success rates. Policymakers have experimented with a few alternatives, such as “lotteries” and “sandpits” (OECD, forthcoming a).

  • REIs provide the selected centres with relatively long-term resources, thereby allowing them in principle to carry out (as their name suggests) excellent research. REIs often include researchers and infrastructures from different institutions, hence promoting the interdisciplinary and co-operative context necessary for high-impact, high-risk “breakthrough” research (OECD, 2014a).

Among the different policy objectives, the issue of how the different funding instruments support breakthrough research is attracting growing attention, particularly in light of rising concerns about the seemingly decreasing productivity of research (Bloom et al., 2017). As previously mentioned, the research community has expressed concerns that competitive funding mechanisms could disadvantage risky, potentially transformative, or transdisciplinary research proposals in favour of applied, incremental, or disciplinary proposals. Indeed, reconciling the desire for more efficient and transparent research funding with the need to support more innovative (but also riskier) projects poses a real challenge.

Studies on this topic provide recommendations on how to design instruments to fund breakthrough research (e.g. Laudel and Gläser, 2014; Wang, Lee and Walsh, 2018). Some studies recommend tailoring funding mechanisms to the need for creativity in science, rather than simply adding criteria to existing project-funding schemes. Others claim that competitive funding can support breakthrough research, provided it is specifically adapted to this strategic objective (Heinze, 2008; Goldstein and Narayanamurti, 2018). The Japanese Government, for instance, announced that the number of selection panels in the main competitive instrument (the Grants-in-Aid for Scientific Research programme, “kakenhi”) will drop from close to 500 to around 375, to foster research originality and creativity (Hornyak, 2017). The increase in competitive funding has been blamed for a markedly increased concentration of basic-research funding in the hands of a small number of Japanese institutions, a loss of diversity that is seen as detrimental to novelty and alternative scientific ideas (Matsuo, 2018).

Over the last two decades, considerable conceptual, data-collection and case-study work has greatly improved the characterisation and measurement of research-funding trends. The increasing diversity of design variants for funding instruments offers policymakers new levers to accommodate a widening set of policy needs. However, knowledge on the effects of research-funding mechanisms is far scarcer, notably owing to many methodological problems (Butler, 2010). This chapter has proposed a conceptual framework to represent the new research-funding landscape and analyse which policy instruments (and their design variants) can theoretically fulfil different policy objectives. Yet this analysis of the “purpose fit” of funding instruments is still in its infancy and will be the subject of further work to assess how policymakers can best fund research to realise their priorities.

Pushing this research agenda further will require going beyond an “instrument-by-instrument” analysis, to examine the instruments’ combined effects and interactions with the institutional environment:

  • Competitive and non-competitive funding interact in several ways, exhibiting both positive complementarities and tensions. For instance, a project grant generally covers only part of the costs of the research activities and requires matching funds that are often found in the block funding for university research (frequently in the form of research staff time). Implementing the project also requires services and equipment financed through past and present institutional funding. Typically, institutional funding provides money to build and maintain basic capacity (i.e. skills and the work environment) and finance day-to-day operations, whereas project funding supports more targeted research (Lepori, Geuna and Mira, 2017). However, this traditional model is becoming blurred, as rules (not least concerning overheads and eligible expenses) are changing and vary among countries. As a result, drawing a clear-cut distinction between the respective contributions of longer-term institutional funding and competitive project funding to the steering of research has become even more difficult.

  • The institutional environment is essential to understanding the funding landscape. Some important parameters to consider are the existence, size and scope of funding agencies, and the nature of their relationships with ministries; the existence of “umbrella organisations”, to which government can delegate some programming and funding roles (e.g. the National Centre for Scientific Research [CNRS] in France); and universities’ internal organisation (e.g. the internal funding-allocation mechanisms) and strategic management capabilities.

Research funding is a complex, staged and multifaceted issue, which calls for a systemic view in order to understand its dynamics and assess its effects. The “In my view” box below provides some guidelines to pursue a holistic analysis of research funding.

Emerging or ongoing trends are already changing funding practices and landscapes; the future evolution of research funding is therefore uncertain. With the growing importance of innovation in all human activities, the pressure will grow for research to deliver workable solutions to real-world problems. A likely scenario is that research will continue to evolve as a demand-driven activity, favouring mechanisms in which research users – rather than researchers alone – increasingly shape the research agenda. Such an evolution could promote not only competitive mechanisms, but also forms of institutional funding that steer research. It could also multiply the objectives expected of any research activity, as shown in the growing list of project-evaluation criteria and the expanding formulas for performance-based institutional funding. This trend could jeopardise the ability of a given research project to excel in a specific dimension, e.g. scientific excellence, high-risk research or economic/social relevance. The modes of research support will most likely continue to evolve to deal with this issue, either by segmenting funding according to types of objectives or by creating new modes of “customised” project evaluation.

The growing recognition of the Sustainable Development Goals (SDGs) as challenges to be addressed in research and innovation is a salient trend (Chapter 4 on the SDGs). The literature has widely documented that research relevant to SDGs will need to be transformational, hence ambitious, interdisciplinary and performed with a mid- to long-term horizon. While this does not in principle imply project-based funding, the pressure for greater accountability and cost efficiency will clearly favour competitive-funding approaches. Designing new instruments and programmes (such as different forms of mission-oriented programmes) will be key to juggling the competing requirements of strategic steering, competitive allocation and risk-taking.

The articulation between instrument design and policy objectives is also changing as digitalisation transforms the research and innovation enterprise (as evidenced in this Outlook). Digitalisation is improving the ability of policymakers and funders to monitor research: more up-to-date information is available, which can be analysed in more depth, facilitating evaluation (see Chapter 12 on digital science and innovation policy). Information useful for resource allocation could be accessed directly through data processing, reducing the need for costly competitions. At the same time, digitalisation can lower the cost of competitive funding (project-preparation work can be subjected to versioning and re-used, and panels can be organised online), which could enhance its appeal.

Needless to say, research to address the SDGs and/or reap the opportunities of digitalisation will require ever-increasing financial inputs, in the context of the rising costs of research and budget pressure in indebted states. Tensions over budgetary negotiations will undoubtedly grow between policy fields. Research – which is both an increasingly costly policy field and a key enabler of the transformational agenda – will be at the heart of these debates.

References

Bloom, N. et al. (2017), “Are Ideas Getting Harder to Find?”, NBER Working Paper, No. 23782, National Bureau of Economic Research, Cambridge, MA, http://www.nber.org/papers/w23782.

Butler, L. (2010), “Impacts of performance-based research funding systems: A review of the concerns and the evidence”, in Performance-Based Funding for Public Research in Tertiary Education Institutions: Workshop Proceedings, OECD Publishing, Paris, pp. 127-165, https://doi.org/10.1787/9789264094611-en.

Dialogic and Empirica (2014), “The effectiveness of national research funding systems”, Dialogic and Empirica, Utrecht/Bonn, https://www.dialogic.nl/wp-content/uploads/2016/12/2013.109-1422.pdf.

EC/OECD (2017), STIP Compass: International database on STIP policies (database), April 2018 version, https://stip.oecd.org.

Goldstein, A.P. and V. Narayanamurti (2018), “Simultaneous pursuit of discovery and invention in the US Department of Energy”, Research Policy, Vol. 47, pp. 1505-1512, Elsevier, Amsterdam, https://doi.org/10.1016/j.respol.2018.05.005.

Heinze, T. (2008), “How to sponsor ground-breaking research: A comparison of funding schemes”, Science and Public Policy, Vol. 35/5, pp. 302-318, Oxford University Press, Oxford, https://doi.org/10.3152/030234208X317151.

Hicks, D. (2010), “Overview of models of performance-based research funding systems", in Performance-Based Funding of Public Research in Tertiary Education Institutions, OECD Publishing, Paris, https://doi.org/10.1787/9789264094611-en.

Hornyak, T. (2017), “Japan shakes up research funding system”, Nature Index, 1 August 2017, Springer Nature, https://www.natureindex.com/news-blog/japan-shakes-up-research-funding-system.

Jongbloed, B. and B. Lepori (2015), “The Funding of Research in Higher Education: Mixed Models and Mixed Results”, in Huisman, J. et al. (eds.), The Palgrave International Handbook of Higher Education Policy and Governance, pp. 439-462, Palgrave Macmillan, London.

Jonkers, K. and T. Zacharewicz (2016), Research Performance Based Funding Systems: a Comparative Assessment, European Commission, Publications Office of the European Union, Luxembourg, http://publications.jrc.ec.europa.eu/repository/bitstream/JRC101043/kj1a27837enn.pdf.

Laudel, G. and J. Gläser (2014), “Beyond breakthrough research: Epistemic properties of research and their consequences for research funding”, Research Policy, Vol. 43, pp. 1204-1216, Elsevier, Amsterdam, https://doi.org/10.1016/j.respol.2014.02.006.

Lepori, B., E. Reale and A. Orazio Spinello (2018), “Conceptualizing and measuring performance orientation of research funding systems”, Research Evaluation, Vol. 1/13, Oxford University Press, Oxford, https://doi.org/10.1093/reseval/rvy007.

Lepori, B., A. Geuna and A. Mira (2017), “Money matters, but why? Distribution of resources and scaling properties in the US and European higher education”, Presentation at Leiden University, June 2017.

Lepori, B. et al. (2007), “Comparing the evolution of national research policies: What patterns of change?”, Science and Public Policy, Vol. 34/6, pp. 372-388, Oxford University Press, Oxford, https://doi.org/10.3152/030234207X234578.

Mazzucato, M. (2018), Mission-Oriented Research & Innovation in the European Union – A problem-solving approach to fuel innovation-led growth, Directorate-General for Research and Innovation, European Commission, Publications Office of the European Union, Luxembourg, https://ec.europa.eu/info/sites/info/files/mazzucato_report_2018.pdf.

Matsuo, K. (2018), “The structure and issues in Japan’s STI funding”, Presentation at the Euroscience Open Forum (ESOF) Conference, Toulouse, 11 July 2018.

OECD (forthcoming a), Effective Operation of Competitive Research Funding Systems, OECD Publishing, Paris.

OECD (forthcoming b), OECD Reviews of Innovation Policy: Austria 2018, OECD Publishing, Paris.

OECD (2018a), Main Science and Technology Indicators (database), https://www.oecd.org/sti/msti.htm (accessed on 25 June 2018).

OECD (2018b), Research and Development Statistics (database), http://www.oecd.org/innovation/inno/researchanddevelopmentstatisticsrds.htm (accessed on 25 June 2018).

OECD (2017a), OECD Science, Technology and Industry Scoreboard 2017: The digital transformation, OECD Publishing, Paris, https://doi.org/10.1787/9789264268821-en.

OECD (2017b), OECD Reviews of Innovation Policy: Norway 2017, OECD Publishing, Paris, https://doi.org/10.1787/9789264277960-en.

OECD (2016), OECD Reviews of Innovation Policy: Sweden 2016, OECD Publishing, Paris, https://doi.org/10.1787/9789264249998-en.

OECD (2015a), The Innovation Imperative: Contributing to Productivity, Growth and Well-Being, OECD Publishing, Paris, https://doi.org/10.1787/9789264239814-en.

OECD (2015b), Frascati Manual 2015: Guidelines for Collecting and Reporting Data on Research and Experimental Development, The Measurement of Scientific, Technological and Innovation Activities, OECD Publishing, Paris, https://doi.org/10.1787/9789264239012-en.

OECD (2014a), Promoting Research Excellence: New Approaches to Funding, OECD Publishing, Paris, https://doi.org/10.1787/9789264207462-en.

OECD (2014b), OECD Reviews of Innovation Policy: Netherlands 2014, OECD Reviews of Innovation Policy, OECD Publishing, Paris, https://doi.org/10.1787/9789264213159-en.

OECD (2011), Public Research Institutions: Mapping Sector Trends, OECD Publishing, Paris, https://doi.org/10.1787/9789264119505-en.

Reale, E. (2017), “Analysis of National Public Research Funding (PREF) – Final Report”, JRC Technical report, European Commission, Publications Office of the European Union, http://publications.jrc.ec.europa.eu/repository/bitstream/JRC107599/kj0117978enn.pdf.

van Steen, J. (2012), “Modes of Public Funding of Research and Development: Towards Internationally Comparable Indicators”, OECD Science, Technology and Industry Working Papers, Vol. 2012/04, OECD Publishing, Paris, https://doi.org/10.1787/5k98ssns1gzs-en.

Wang J., Y.-N. Lee and J.P. Walsh (2018), “Funding model and creativity in science: Competitive versus block funding and status contingency effects”, Research Policy, Vol. 47, pp. 1070-1083, Elsevier, Amsterdam, https://doi.org/10.1016/j.respol.2018.03.014.

Zdravkovic, M. and B. Lepori (2018), “Mapping European Public Research Funding Studies: Selected results and some open questions”, Presentation at the EU-SPRI conference, 7 June, ESIEE, Marne-la-Vallée.

Notes

← 1. As shown, for instance, in the analysis of responses to the questions on the main public-research policy debates in the 2017 edition of the EC-OECD STIP survey, covering more than 50 countries (EC/OECD, 2017). See also Zdravkovic and Lepori (2018) for an analysis of the academic literature.

← 2. Including 21 of the 36 OECD member countries.

← 3. The OECD Frascati Manual defines GUF as the share of R&D funding from the general grants universities receive from the central government (federal) ministry of education or the corresponding provincial (state) or local (municipal) authorities to support their overall research/teaching activities (OECD, 2015b).

← 4. Part of the country differences relate to the relative weights of research activities performed in HEIs and PRIs.

← 5. Similar criticisms can also be directed towards some forms of performance-based institutional funding.
