Chapter 5. Building stronger foundations to evaluate national education performance

Serbia has established some of the basic components of system evaluation. However, the lack of a national assessment of student learning and a fully functioning education management information system (EMIS) leaves Serbia without an adequate evidence base to guide and monitor policy reforms, making it difficult to understand the main issues stalling educational improvement. This chapter recommends that Serbia focus its new post-2020 education strategy on key national priorities that can improve teaching and learning. In particular, the country should carefully design and implement the new national assessment and encourage policymakers to access and interpret administrative and assessment data when developing education policies. This can help Serbia address systemic issues and lead to a better understanding of where and why students are falling behind in their learning, despite high levels of school participation.

    

Introduction

System evaluation is central to improving educational performance. It holds the government and other stakeholders accountable for meeting national goals and provides information that can help develop effective policies. Serbia has established some of the basic components of system evaluation. For example, a national education strategy provides a reference for planning and the Ministry of Education, Science and Technological Development (hereafter the ministry) works with external partners, such as universities, to conduct research and evaluations. There is some capacity within the ministry and technical agencies to identify national education challenges and evaluate policies. However, Serbia generally struggles to make information about public sector performance widely available (OECD, 2017[1]). In the education system, this is partly because of important gaps in the evaluation infrastructure. Specifically, the lack of a national assessment of student learning and a fully functioning EMIS leaves Serbia without an adequate evidence base to guide and monitor policy reforms, making it difficult to understand the main issues stalling educational improvement.

This chapter recommends several measures that can help Serbia build stronger foundations for system evaluation. This will be crucial as Serbia works towards developing its new post-2020 education strategy. In particular, it is important that Serbia carefully design and implement its new national assessment, which can provide valuable information about the extent to which students are meeting national learning standards. Encouraging policymakers to access and interpret administrative and assessment data when developing education policies can help further address systemic issues and lead to a better understanding of where and why students are falling behind in their learning, despite high levels of school participation. Aligning these reforms can help the Serbian education system improve from a good regional performer to an excellent one.

Key features of effective system evaluation

System evaluation refers to the processes that countries use to monitor and evaluate the performance of their education systems (OECD, 2013[2]). A strong evaluation system serves two main functions: to hold the education system, and the actors within it, accountable for achieving their stated objectives; and, by generating and using evaluation information in the policymaking process, to improve policies and, ultimately, education outcomes (see Figure 5.1). System evaluation has gained increasing importance in recent decades across the public sector, in part because of growing pressure on governments to demonstrate the results of public investment and improve efficiency and effectiveness (Schick, 2003[3]).

In the education sector, countries use information from a range of sources to monitor and evaluate quality and track progress towards national objectives (see Figure 5.1). As well as collecting rich data, education systems also require “feedback loops” so that information is fed back into the policymaking process (OECD, 2017[4]). This ensures goals and policies are informed by evidence, helping to create an open and continuous cycle of organisational learning. At the same time, in order to provide public accountability, governments need to set clear responsibilities – to determine which actors should be accountable and for what – and make information available in timely and relevant forms for public debate and scrutiny. All of this constitutes a significant task, which is why effective system evaluation requires central government to work across wider networks (Burns and Köster, 2016[5]). In many OECD countries, independent government agencies, such as national audit offices, evaluation agencies, the research community and subnational governments, play a key role in generating and exploiting available information.

A national vision and goals provide standards for system evaluation

Like other aspects of evaluation, system evaluation must be anchored in a national vision and/or goals, which provide the standards against which performance can be evaluated. In many countries, these are set out in an education strategy that spans several years. An important complement to national vision and goals are targets and indicators. Indicators are the quantitative or qualitative variables that help to monitor progress (World Bank, 2004[6]). Indicator frameworks combine inputs like government spending, outputs like teacher recruitment and outcomes like student learning. While outcomes are notoriously difficult to measure, they are a feature of frameworks in most OECD countries because they measure the final results that a system is trying to achieve (OECD, 2009[7]). Goals also need to balance the outcomes a system wants to achieve with indicators for the internal processes and capacity throughout the system that are required to achieve these outcomes (Kaplan and Norton, 1992[8]).

Reporting against national goals supports accountability

Public reporting of progress against national goals enables the public to hold the government accountable. However, the public frequently lacks the time and information to undertake this role and tends to be driven by individual or constituency interests rather than broad national concerns (House of Commons, 2011[9]). This means that objective and expert bodies, such as national auditing bodies, parliamentary committees and the research community, play a vital role in digesting government reporting and helping to hold the government to account.

An important vehicle for public reporting is an annual report on the education system (OECD, 2013[2]). In many OECD countries, such a report is now complemented by open data. If open data is to support accountability and transparency, it must be useful and accessible. Many OECD countries use simple infographics to present complex information in a format that the general public can understand. Open data should also be provided in a re-usable form, i.e. for other users to download and use in different ways so that the wider evaluation community, such as researchers and non-governmental bodies, can analyse data to generate new insights (OECD, 2018[10]).

National goals are a strong lever for governments to direct the education system

Governments can use national goals to give coherent direction to education reform across central government, subnational governance bodies and individual schools. For this to happen, goals should be clear, feasible and above all, relevant to the education system. Having a clear sense of direction is particularly important in the education sector, given the scale, multiplicity of actors and the difficulty in retaining focus in the long-term process of achieving change. In a well-aligned education system, national goals are embedded centrally in key reference frameworks, encouraging all actors to work towards their achievement. For example, national goals linked to all students reaching minimum achievement standards or to teaching and learning fostering student creativity are reflected in standards for school evaluation and teacher appraisal. Through the evaluation and assessment framework, actors are held accountable for progress against these objectives.

Figure 5.1. System evaluation

Tools for system evaluation

Administrative data about students, teachers and schools are held in central information systems

In most OECD countries, data such as student demographic information, attendance and performance, teacher data and school characteristics are held in a comprehensive data system, commonly referred to as an education management information system (EMIS). Data are collected according to national and international standardised definitions, enabling data to be collected once only, used across the national education system and reported internationally. An effective EMIS also allows users to analyse data and helps disseminate information about education inputs, processes and outcomes (Abdul-Hamid, 2014[11]).

National and international assessments provide reliable data on learning outcomes

Over the past two decades, there has been a major expansion in the number of countries using standardised assessments. The vast majority of OECD countries (30) and an increasing number of partner countries have regular national assessments of student achievement for at least one level of the school system (OECD, 2015[12]). This reflects the global trend towards greater demand for outcomes data to monitor government effectiveness, as well as a greater appreciation of the economic importance of all students mastering essential skills.

The primary purpose of a national assessment is to provide reliable data on student learning outcomes that are comparable across different groups of students and over time (OECD, 2013[2]). Assessments can also serve other purposes such as providing information to teachers, schools and students to enhance learning and supporting school accountability frameworks. Unlike national examinations, they do not have an impact on students’ progression through grades. When accompanied by background questionnaires, assessments provide insights into the factors influencing learning at the national level and across specific groups. While the design of national assessments varies considerably across OECD countries, there is a consensus that having regular, reliable national data on student learning is essential for both system accountability and improvement.

An increasing number of countries also participate in international assessments such as the OECD Programme for International Student Assessment (PISA) and the two programmes of the International Association for the Evaluation of Educational Achievement (IEA), Trends in International Mathematics and Science Study (TIMSS) and Progress in International Reading Literacy Study (PIRLS). These assessments provide countries with periodic information to compare learning against international benchmarks as a complement to national data.

Thematic reports complement data to provide information about the quality of teaching and learning processes

Qualitative information helps to contextualise data and provide insights into what is happening in a country’s classrooms and schools. For example, school evaluations can provide information about the quality of student-teacher interactions and how a principal motivates and recognises staff. Effective evaluation systems use such findings to help understand national challenges – such as differences in student outcomes across schools.

A growing number of OECD countries undertake policy evaluations

Despite increased interest across countries, policy evaluation is rarely systematic at present. Different approaches include ex ante reviews of major policies to support future decision-making and evaluation shortly after implementation (OECD, 2018[13]). Countries are also making greater efforts to incorporate evidence to inform policy design, for example, by commissioning randomised controlled trials to determine the likely impact of a policy intervention.

Effective evaluation systems require institutional capacity within and outside government

System evaluation requires resources and skills within ministries of education to develop, collect and manage reliable, quality datasets and to exploit education information for evaluation and policymaking purposes. Capacity outside or at arm’s length from ministries is equally important and many OECD countries have independent evaluation institutions that contribute to system evaluation. Such institutions might undertake external analysis of public data or be commissioned by the government to produce annual reports on the education system and undertake policy evaluations or other studies. Such institutions may receive public funding to ensure they have sufficient capacity, but their statutes and appointment procedures safeguard their independence and the integrity of their work.

System evaluation in Serbia

Serbia has some of the basic components that are integral to performing system evaluation. For example, a national education strategy provides a reference for planning and the ministry, along with specialised technical bodies, collects valuable data and has some capacity for policy evaluation. Nevertheless, there are major gaps in terms of system evaluation tools. In particular, the lack of a national assessment and a low-functioning EMIS limit Serbia’s ability to conduct analysis and provide timely information about the performance of the education system. This contributes to a relatively underdeveloped culture of public reporting and information sharing. Without such tools and processes for system evaluation, public accountability becomes a challenge and the impetus to improve the education system fades. Table 5.1 shows some of the components and main gaps for system evaluation in Serbia.

Table 5.1. System evaluation in Serbia

References for national goals and vision:

  • The Strategy for Education Development in Serbia 2020

  • European Union (EU) 2020 goals for education and training

Tool: Administrative data
Bodies responsible: Statistical Office of the Republic of Serbia (SORS); Ministry of Education, Science and Technological Development, Sector for Digitalisation in Educational Science
Outputs: Regular statistical releases; Unified Information System of Education (UISE), Dositej platform and eClass Register

Tool: National assessment
Body responsible: The Institute for Education Quality and Evaluation’s (IEQE) Centre for International and National Assessments and Research and Development
Outputs: Under development

Tool: International assessments
Body responsible: Until recently, the University of Belgrade was responsible for PISA and the Institute for Educational Research for TIMSS; all international assessments are now the responsibility of the Institute for Education Quality and Evaluation (IEQE)
Outputs: National reports on PISA (age 15: mathematics, science and reading) and TIMSS (Grade 4: mathematics and science)

Tool: School evaluations
Body responsible: Institute for Education Quality and Evaluation (IEQE)
Outputs: Annual report on school evaluations

Tool: Policy evaluations
Body responsible: Institute for Education Quality and Evaluation (IEQE)
Outputs: Ad hoc policy evaluations in response to ministry requests

Tool: Reports and research
Bodies responsible: Institute for Education Quality and Evaluation (IEQE); international partners (EU and donor agencies)
Outputs: No overall report on the education system; various specialised agencies produce ad hoc situation analyses, feasibility studies and evaluation exercises

High-level documents provide a clear vision for the education system

The 2020 Education Strategy marks a step change in policymaking

In 2012, the Ministry of Education, Science and Technological Development (hereafter the ministry) adopted the Strategy for Education Development in Serbia 2020 (hereafter the strategy). The document includes four broad objectives for education (see Chapter 1) that aim to provide a foundation for the economic, social, scientific, technological and cultural development of individuals, society and the Serbian state. The strategy represents continued efforts to move Serbia’s education system away from a culture based on political negotiations over legislation towards one that draws on evidence, aligns with national goals and can better support public accountability.

The ministry established a Project Unit in 2011 to develop the strategy. The unit was led by two external researchers and engaged more than 200 renowned experts to help analyse the state of education in Serbia and set out a comprehensive vision for developing the sector from pre-school to adult education (MoESTD, 2012[14]). The strategy also underwent a one-month public consultation process. However, there is some evidence that public consultation in Serbia does not generally enable all interested parties to provide timely, high-quality input (European Commission, 2018[15]).

Serbia’s education strategy includes a diagnosis of the country’s strengths, weaknesses, opportunities and threats (SWOT) across the sector. It also offers some quantitative targets that align with those established by the European Union’s (EU) 2020 Strategy (see Box 5.1). The abundance of goals and targets interspersed throughout the extensive strategy document provides great ambition for Serbia’s education system but little prioritisation of the issues most important for driving improvement. In 2015, Serbia adopted an action plan to support the strategy’s implementation and a special working group within the ministry prepared a progress report in 2018. However, this report was mainly descriptive and offered no recommendations about where efforts should be prioritised to improve teaching and learning (MoESTD, 2018[16]). The ministry also reports that such evaluations are not usually available to the public (MoESTD, 2018[17]).

Box 5.1. Selection of targets included in Serbia’s 2020 Education Strategy

Some of the quantitative targets included in the Strategy for Education Development in Serbia 2020 align with the European Union’s 2020 Strategy, in particular benchmarks around enrolment in higher education and participation in adult learning programmes. A selection of key targets from Serbia’s strategy include:

  • At least 98% enrolment in primary education and a drop-out rate no higher than 5%.

  • At least 95% of those who complete primary school enrol in secondary school.

  • At least 95% of those enrolled in four-year secondary vocational schools complete their studies.

  • At least 50% of the total student cohort enrols in higher education institutions.

  • At least 7% of the population follow one of the programmes dedicated to adult education and lifelong learning.

Notably, the Serbian 2020 strategy does not include benchmarks related to underperformance in reading, mathematics and science, or to the share of employed graduates (individuals aged 20-34 who completed at least upper secondary education and left education 1-3 years ago), which are high-level standards set by the EU.

Sources: European Commission (n.d.[18]), European Policy Cooperation (ET 2020 framework), https://ec.europa.eu/education/policies/european-policy-cooperation/et2020-framework_en (accessed on 8 July 2019); MoESTD (2012[14]), Strategy for Education Development in Serbia 2020, http://erasmusplus.rs/wp-content/uploads/2015/03/Strategy-for-Education-Development-in-Serbia-2020.pdf.

Discussions about the new education strategy are underway

Serbia’s current education strategy will end in 2020. As such, the ministry has started discussions about the contents for a new medium-term strategy that will outline the country’s vision for education to 2030. The new strategy will cover a critical period for Serbia’s national development and potential accession to the EU, highlighting the importance of directing the education sector towards supporting more students to achieve good and excellent outcomes. The new strategy aims to build on the strengths of Serbia’s 2020 strategy, namely to consult with a range of stakeholders and undertake a strategic review of the system’s key strengths, challenges, opportunities and threats. However, the ministry also aims to make the next strategy more achievable by narrowing its focus and considering the resources needed for implementation. Such efforts would help ensure the new strategy prioritises key education issues and guides efforts to drive improvement.

Action plans do not provide clear goals nor precise targets

Serbia’s action plan for the implementation of the education strategy consists of three distinct parts. Respectively, these address pre-university education, higher education and a cross-cutting education development strategy. All of the action plans set out activities, implementation methods, deadlines, key actors, indicators of progress, resourcing needs, as well as procedures for monitoring, evaluation and reporting. The action plans specifically call for the use of a special electronic database and a ministry-appointed working group to support overall monitoring of the strategy (MoESTD, 2015[19]). However, the special working group was only established in 2018, six years after the strategy was adopted, and there is no electronic database to monitor the strategy’s implementation.

In addition to underdeveloped monitoring processes, Serbia’s action plans do not clearly align with the goals and targets that are interspersed throughout the ambitious strategy document. For example, while both the strategy and action plan express the goal of reducing the primary school drop-out rate, only the former sets a clear target of having no more than 5% of primary students drop out by 2020 and identifies specific groups of students at risk of doing so (MoESTD, 2012[14]). Moreover, some of the activities and implementation steps in the action plans do not address important parts of the implementation process. The activity of developing a final exam system at the end of secondary education, for example, does not include key actions such as piloting the new exam or sensitising schools and students about how the new exam will operate (see Table 5.2). While Serbia may take these actions to benefit from donor funding, such as the EU Instrument for Pre-accession Assistance (IPA), the vagueness of the national action plans for education hinders system monitoring and evaluation (see Recommendation 5.1.2).

The lack of clear goals, targets and actions offers limited guidance on what education actors should be working towards to help improve the quality and equity of Serbia’s education system. This can fragment efforts and undermine accountability, as directing action and communicating performance become more difficult without clear benchmarks. Moreover, policymakers are not required to investigate or explain why certain goals and targets were not met, presenting another challenge to accountability. It will likely remain difficult for Serbia to implement its education strategies and action plans without relevant and reliable sets of indicators to help guide and measure progress.

Emphasis on results in public financial management is limited

Serbia’s action plans for education include information about the funding required for various activities. While this review did not look specifically at how the education budget is negotiated and allocated, other OECD analysis finds that government budgets in Serbia are not prepared on the basis of strategic plans or systematic analysis of programmes to encourage discipline (OECD, 2017[1]). Moreover, the review team was informed that new legislation in Serbia is assessed as having “no cost for implementation”. This means that laws are approved by the government and parliament but then face major implementation challenges as there is no discussion of cost implications. These processes for public financial management do not encourage policymakers or line ministries to exercise fiscal discipline and focus when developing long-term reforms. As such, Serbia’s education strategy and action plans may not be financially viable and the pressure to achieve system goals is reduced because funding is not linked to planning or performance.

Donor funding has helped fill some of the resource gaps in Serbia’s education sector. However, since the ministry and central government do not adequately prioritise, plan or provide sufficient resources for actions, many important reforms and policies have waned or been discontinued. For example, the ministry did not take ownership of the new master’s programme for school leadership when EU funding ended in 2016/17 (see Chapter 4). As a result, enrolment in the programme dropped significantly, partly because aspiring principals were left to pay for courses out of pocket and given little incentive to do so. Serbia’s experience with national assessment provides another example of poor strategic planning. Previously, national assessments were financed by donors on an ad hoc basis. However, the lack of government funding to carry out these exercises in the medium to long term helps to explain why Serbia has not had a regular national assessment since 2006 (World Bank, 2012[20]).

Prospect of EU accession creates a demand for system evaluation

Alongside Montenegro, Serbia is one of the two Western Balkan countries most advanced in the European Union accession process. As an accession country, Serbia benefits from the EU Instrument for Pre-accession Assistance (IPA). This has provided significant financial and technical resources to support important education reforms, including the development of tools that can support system evaluation, such as a new national assessment. The prospect of EU membership has also become an important framing objective and is helping to raise expectations for system improvement in line with European standards. For example, in line with EU 2020 goals, Serbia has committed to reducing its share of early school leavers and increasing the share of 30-34 year-olds that have completed tertiary education to at least 40% by 2020 (Eurostat, 2019[21]). These new tools and expectations have put pressure on the government to improve system evaluation processes for more results-oriented monitoring, planning and accountability.

Tools for system evaluation are not fully developed

Serbia has some of the institutions and processes required to gather information and monitor the performance of the education system. However, there are challenges around national data collection and there was no national assessment of student learning between 2006 and 2018. This means that Serbia’s only external sources of information about learning outcomes are sample-based international assessments, such as PISA and TIMSS, and the final examination that students take at the end of compulsory schooling (see Chapter 2). Serbia’s situation contrasts with other countries in the region, which have managed to develop tools for system evaluation more fully. Improving teaching and learning outcomes will require more sophisticated tools to measure inputs, outputs and outcomes of the education system.

Efforts are underway to modernise data collection and management

Serbia’s 2020 Education Strategy set out a series of measures to modernise the country’s EMIS. Some of these measures have already been achieved or are being implemented. For example, in 2016 the ministry introduced the Dositej platform, which provides an interface for schools to enter administrative data directly into a secure online database, rather than through paper or electronic forms that must be aggregated at the central level. Another innovation is the eClass Register (eDnevnik in Serbian), which the ministry introduced in 2019 to make enrolment and the reporting of classroom data more efficient and available to parents. While this tool could lead to interesting studies at the regional or national level, it is not currently integrated with the Dositej platform, making it difficult for researchers to analyse information across the two databases (e.g. to design an early warning system for drop-out). The ministry plans to start linking various databases by introducing a unique educational number (UNI) for students. This will make it possible to track an individual’s progression through the system and analyse education inputs, processes and outcomes. The ministry has made less progress in defining relationships among statistical bodies, harmonising methodologies and using international data standards.
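Record linkage of this kind depends only on both systems storing the same identifier for each student. The sketch below illustrates the general idea with two invented extracts; the table layouts, field names, UNI codes and risk thresholds are assumptions made for the example and do not reflect the actual structure of the Dositej platform or the eClass Register.

```python
import pandas as pd

# Hypothetical extract of enrolment data (illustrative fields and values only).
enrolment = pd.DataFrame({
    "uni": ["RS0001", "RS0002", "RS0003"],   # planned unique educational number
    "grade": [5, 5, 6],
    "municipality": ["Novi Sad", "Nis", "Subotica"],
})

# Hypothetical extract of classroom data from a separate system.
classroom = pd.DataFrame({
    "uni": ["RS0001", "RS0002", "RS0003"],
    "days_absent": [3, 21, 7],
    "average_mark": [4.2, 2.6, 3.4],
})

# With a shared UNI, the two sources can be joined student by student.
linked = enrolment.merge(classroom, on="uni", how="inner")

# One analysis this enables: a simple early-warning flag for drop-out risk,
# using purely illustrative thresholds rather than any official criteria.
linked["at_risk"] = (linked["days_absent"] > 15) | (linked["average_mark"] < 3.0)

print(linked[["uni", "grade", "days_absent", "average_mark", "at_risk"]])
```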

Administrative data collection does not follow unified procedures

There are currently two key bodies that collect and manage education data in Serbia. The first is the Statistical Office of the Republic of Serbia (SORS), which collects and processes statistical data for national and international reporting in a variety of fields. These include (among others) the economy, finance, agriculture and regional policy. With regard to the education sector, the SORS collects administrative data on the number of schools, classes and students at the beginning and end of the school year and the number of teacher working hours. Some of this data can be disaggregated by gender and minority language, but no data can be used as a proxy for socioeconomic background. The SORS also manages the DevInfo database, which was developed in 2004 to help the government monitor human development, support planning and facilitate reporting (MoESTD, 2018[17]). The DevInfo database includes education indicators, such as literacy rates and public expenditure on education.

The second body responsible for data collection and management is the Serbian education ministry, through its Unified Information System of Education (UISE). The UISE was introduced in the early 2000s as the ministry’s official EMIS. While the UISE also collects and stores administrative data about the education system, it manages a more comprehensive list of indicators than the SORS does. The type of data stored in the UISE and how it should be used, updated and kept secure is regulated by the national education law (Law on the Foundations of Educational System, LFES) (MoESTD, 2018[17]). However, despite political discussions on the system, the relevant bylaws that set out detailed procedures for collecting and managing data are only now being developed, which prevents the UISE from operating at its full potential. Moreover, staff turnover within the ministry has made it difficult to further develop and improve the UISE.

To collect administrative data, both bodies rely on educational institutions from the pre-primary to the tertiary level, which are required to respond to various information questionnaires. Importantly, the data collected and reported by the SORS follows international definitions and procedures while the ministry’s UISE does not. For example, to collect information about educational attainment, the SORS calculates the average number of students enrolled at the beginning of a school year, minus the number of students enrolled at the end of a school year. The ministry, on the other hand, uses its own definitions to calculate attainment and there are no protocols to ensure the quality of the data collected. There were attempts in 2016 to create a national strategy on education statistics between the ministry and the SORS to ensure that all data be collected according to standard definitions; however, this was never realised. Having two parallel data collection and management systems not only prevents Serbia from establishing a unified source of reliable information about its education system but also creates an unnecessary reporting burden for schools.
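The practical consequence of differing definitions can be seen with a simple worked example: the same enrolment counts yield different figures depending on the denominator used. The numbers and both formulas below are illustrative assumptions, not the official SORS or UISE methodologies.

```python
# Hypothetical enrolment counts for one cohort (illustrative figures only).
enrolled_start = 68_000   # students enrolled at the beginning of the school year
enrolled_end = 66_300     # students still enrolled at the end of the school year

students_lost = enrolled_start - enrolled_end

# Definition A: loss as a share of enrolment at the start of the year.
dropout_rate_a = students_lost / enrolled_start

# Definition B: the same loss as a share of average enrolment over the year.
average_enrolment = (enrolled_start + enrolled_end) / 2
dropout_rate_b = students_lost / average_enrolment

print(f"Definition A: {dropout_rate_a:.2%}")  # 2.50%
print(f"Definition B: {dropout_rate_b:.2%}")  # 2.53%
```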

A pilot national assessment has been introduced

In 2016, a World Bank functional review of Serbia’s education sector highlighted the importance of measuring learning outcomes at the school level on a regular basis and improving administrative data to support key educational reforms (World Bank, 2016[22]). The absence of regular measures of learning outcomes has been a major limitation in evaluating and improving educational quality at the system level, but Serbia is starting to address this gap by developing a new national assessment. The lack of a national assessment sharply contrasts with the majority of EU and OECD countries, which administer some form of national assessment to measure student learning (OECD, 2013[2]). Serbia’s last national assessment was administered in 2006 to a sample of students in their final year of primary school (Grade 4). Since then, results from the end of basic education exam (Grade 8) have been used as the only national tool for monitoring student learning outcomes. However, the structure of the final exam gives a very limited understanding of students’ skills and development (see Chapter 2). Moreover, the “combined test” assesses several subjects at once, limiting its relevance for analysing individual subjects. As a result, Serbia has little information about learning outcomes during transition years. This is problematic since there is a general concern that performance tends to decrease when students move from classroom teachers to subject teachers (starting in Grade 5).

In 2017/18, Serbia piloted a new national assessment for students in Grades 7 (basic education) and 11 (upper secondary). Serbia decided to administer the assessment in Grade 11 in order to pilot test questions for the new Matura, which is under development and will be used to certify graduation from secondary school and inform selection into tertiary education (see Chapter 2). The OECD understands that, in the future, the assessment will be administered in Grade 6, though no fixed cycle has yet been established. The sample-based pilot was administered in paper-and-pencil format and tested students’ knowledge in mathematics, physics and history (MoESTD, 2018[17]). Four interdisciplinary subjects were also assessed: problem-solving; digital competency; tolerance, entrepreneurship and responsibility towards the environment. To better understand the conditions in which the learning process takes place, the pilot was accompanied by a ten-page background questionnaire for students, teachers and school principals.

Serbia’s Institute for Education Quality and Evaluation (IEQE) was responsible for the overall plan and design of the pilot assessments; however, the ministry, its regional units and external associates (i.e. experienced teachers) were responsible for their implementation. Results from the pilot national assessment are expected in 2019 and there are plans to report the findings in three formats: an internal report for the ministry, a report for education actors (e.g. schools and training providers) and a summative report for the general public. This experience serves as a strong foundation for Serbia to fully implement its new national assessment.

Participation in international assessments is somewhat irregular

Serbia regularly participates in the International Association for the Evaluation of Educational Achievement (IEA)’s TIMSS, though at different grade levels. In the 2003 and 2007 cycles, only students in Grade 8 took the TIMSS assessment and since 2011, only students in Grade 4 have participated. Serbia also regularly participated in PISA between 2006 and 2012 and participated again in PISA 2018. The latter was the first time students in Serbia took the PISA assessment using computers rather than pencil and paper.

The administration of large-scale international assessments in Serbia was previously managed by the University of Belgrade (PISA) and the Institute for Educational Research (TIMSS). However, the IEQE was recently made responsible for all international assessments, in addition to managing national exams and developing the new national assessment. The experience of administering PISA and TIMSS will help the IEQE to develop its capacity to administer large-scale assessments of student learning. However, this also places an additional workload on the IEQE and it is unclear if resources and technical expertise are being increased proportionately to ensure the institute can meet the demands of these new responsibilities.

Evaluation and thematic reports

Thematic evaluations exist but there is no national analysis of the education system

Serbia’s national statistical office (SORS) prepares an annual statistical report on education that includes administrative data, such as the number of students across different levels of education, demographic information, completion and drop-out rates, and the number of teaching staff. Most technical agencies also prepare annual reports based on their programmes of work. For example, the IEQE develops an annual report summarising key findings from external school evaluations. This provides valuable information about how schools perform compared to school quality standards, the main challenges they face and recommendations for improvement (see Chapter 4). The IEQE also produces regular public reports on results from the final exam of compulsory education and international assessments.

Serbia has some thematic reports that can provide an important accountability function. However, this information has not been pulled together into a comprehensive report that evaluates the overall state of education. This makes it difficult to highlight the main system-level challenges and communicate policy priorities.

The IEQE leads the practice of evaluating policies and programmes

In addition to reporting on thematic areas of the education system, the IEQE also undertakes ad-hoc research at the request of the ministry and has established a practice of using evidence to inform education policy. For example, the IEQE organised two large-scale conferences and conducted statistical analysis over a five-year period to ensure that the new quality standards for schools and the indicators used to measure them provided clear definitions of good teaching and learning.

International donors drive thematic evaluations

On occasion, international donors contribute to system evaluation by providing valuable analysis of education issues in Serbia through thematic reports or policy evaluations. These exercises often consist of situation analyses and/or feasibility studies on specific education policies (MoESTD, 2018[17]). For example, the EU has commissioned studies related to inclusive education as well as other reform efforts. The United Nations Children's Fund (UNICEF) and the World Bank have also conducted analysis on education issues, including early childhood education and care. While the work of external actors can provide important insights for system evaluation, it can also lead governments to focus on priorities that are determined by external actors and pay less attention to developing the capacity of national agencies.

Evaluation institutions

The Institute for Education Quality and Evaluation (IEQE) has a formal mandate for system evaluation

The IEQE is the main institution in Serbia with a formal mandate to evaluate the education system independently and carry out research for strategic planning purposes. The IEQE has four organisational units:

  1. The Centre for Quality Assurance of Educational Institutions, which is responsible for developing education standards; developing standards and instruments for school evaluations; occasionally participating in external school evaluations; producing annual reports on school evaluations; and providing training on self-evaluation and student assessment.

  2. The Exam Centre, which develops and manages Serbia’s two national examinations and produces periodic reports on results.

  3. The Centre for International and National Assessments and Research and Development, which is responsible for research and evaluation and making recommendations on how the ministry can support system improvement based on analysis.

  4. The Centre for Educational Technology, a relatively new organisational unit that is responsible for the application of new technologies in education.

The staff within the various IEQE units have significant technical expertise and are responsible for implementing two of Serbia’s major education reforms: the new Matura exam and the national assessment. However, capacity remains a challenge since less than half of the current staff are skilled education professionals and there is a lack of individuals with experience in quantitative research, statistical analysis, psychometrics and survey design (MoESTD, 2018[17]). Moreover, restrictions on hiring public service employees, low salaries and heavy workloads make it difficult to recruit and retain staff. As such, the IEQE sometimes commissions external experts or research institutions to help carry out the institute’s programme of work. Since the IEQE’s responsibilities are expanding (it is now also responsible for administering all international assessments), limited human and financial resources may jeopardise the institute’s ability to conduct system evaluation.

Evaluation and analytical capacity within the ministry is limited

There is limited capacity within the ministry to conduct system evaluation. The Group for Analytics was established as the evaluation and research arm of the ministry in 2014. Despite its position within the ministry’s Sector for Higher Education, the Group for Analytics was given a mandate to collect evidence and analyse policies across the whole education system (not just higher education). However, because of significant fluctuations in personnel, the ambitions of the group were never realised and its operations are currently managed by a single staff member (MoESTD, 2018[17]).

Policy issues

The primary challenge to developing system evaluation in Serbia is the absence of clear high-level goals for the education system that are accompanied by precise targets. First and foremost, this review strongly recommends that Serbia use the opportunity of developing a new national education strategy to identify a clear set of priorities for the education system and create action plans and indicator frameworks to help drive system improvement. With system goals in place, the country can then work towards developing the high-quality data needed to monitor progress and promote more transparent and evidence-informed policymaking. This will involve strengthening procedures for data collection and addressing important information gaps, in particular in student learning outcomes. Finally, Serbia’s new national assessment can help better understand how students are performing and serve as a reference to improve teaching and learning.

Policy issue 5.1. Using the new education strategy to focus on achieving national priorities

Serbia’s current education strategy is ambitious and extensive. The strategy document was informed by research about the performance of the education system and underwent a stakeholder consultation process. This allowed for a lengthy description of the various challenges facing the system. The document itself is over 230 pages long and its action plan, which was developed 4 years after the strategy was introduced, sets out around 157 different activities to be carried out across the education system. This has made it very difficult to drive system improvement since there is no prioritisation of what issues and actions are most important.

As Serbia works to develop its next medium-term strategy for the education sector, the ministry should focus on key national priorities that are supported by an action plan better designed to steer the implementation process. In particular, there should be greater alignment between the strategy and action plan, with specific goals and activities accompanied by measurable targets. Serbia will also need to establish a much stronger link between the strategy and resources. This was a considerable challenge for the current strategy, which was based on the education budget increasing to 6% of gross domestic product (GDP) by 2020. However, in practice, education spending fell across the duration of the strategy.

Strategic education priorities need to be costed and action plans developed in agreement with the Ministry of Finance, based on a robust dialogue about the required and available funds. While the Serbian Ministry of Finance prepares annual budgets within a three-year medium-term framework, the timelines for preparing these are too tight for a proper assessment and debate of programmes (OECD, 2017[1]). Resource considerations should address investment in the human and technical capacity to carry out evaluation processes and manage the instruments needed to support a more results-oriented, transparent and accountable planning cycle.

Recommendation 5.1.1. Identify national priorities for the new strategy

In Serbia, the current strategy’s multiplicity of objectives is difficult to distil into a small number of high-priority goals that drive system improvement. Moreover, the progress indicators included in the action plan are not always relevant and lack specific targets. This presents a risk of policy misalignment and uncoordinated initiatives. As Serbia works towards developing a new education strategy, national goals should be more specific and clearly expressed. They should also be accompanied by relevant and reliable indicators with precise targets to help monitor progress.

The first step in this process will be to determine what strategic issues should be prioritised. Evaluating the achievements of the 2020 strategy and triangulating this information with other evidence can help identify the most pressing issues facing the Serbian education system. Serbia will also need to think about what challenges the education system is likely to encounter in the future. Next, a clear set of meaningful goals that are easy to communicate across the education sector and society should be established to galvanise support for system improvement. Engaging the public, both during the strategy’s development and after its adoption, can help build consensus and understanding that these goals are national and urgent priorities, which transcend political factions and stand to benefit public interest. This can also help promote transparency and trust in education reform.

Evaluate the 2020 strategy and other evidence to prioritise key strategic issues

In Serbia, the 2020 Education Strategy is the highest-level strategic document that guides education activities. These activities include more than 124 different policies, actions and measures that are proposed to improve pre-university education. There are even more proposals for vocational and higher education. Efforts include defining a concept for a secondary graduation exam, introducing socially relevant elective courses and developing operational and quality standards for different types of early childhood education and care (ECEC) provision (MoESTD, 2012[14]). While such activities can lead to improvements, the lack of prioritisation about what is most important presents a major challenge for Serbia since it fails to direct the education system and galvanise support among various stakeholders.

A holistic evaluation of Serbia’s 2020 strategy would not only provide an account of progress made to improve the country’s education system but also offer insights into the successes and challenges of the current strategy, i.e. why some objectives were achieved while others were not. This evaluation could build on the strategy’s 2018 progress report; however, the new analysis should focus more on measuring progress against the strategy, on drawing conclusions from the evaluation and other key sources of evidence to prioritise strategic issues and on identifying specific goals and targets for the next strategy. The holistic evaluation could also assess the strategy and action plans themselves to better understand how they were perceived, understood and used by different stakeholders to provide insights into how the new strategy could be more operational. The ministry should task the IEQE to undertake the evaluation of the 2020 strategy since this body has the technical expertise required. Findings from the evaluation report should be made available to the public and parliament to support transparency and accountability. In turn, this report can feed into the consultation process for the next strategy.

Consider a range of evidence

In addition to drawing on findings from the evaluation of the 2020 strategy, Serbia should continue the practice of considering a wide range of evidence to develop its new education strategy. For example, information from national data sources, international benchmarks and research findings should be triangulated to decide what issues to address first. Serbia’s current education strategy recognises many of the challenges facing the country’s education system and offers foresight into the challenges the country is likely to face in the future; however, there could be more prioritisation. The current strategy also offers some benchmarks against regional peers and identifies areas for capacity development, in particular the need to develop and use education statistics more effectively. Nevertheless, there is a very limited discussion about what capacities are needed to better plan, deliver and evaluate education policies. As such, this review recommends that Serbia consider a range of evidence when identifying national education priorities, including what capacities should be developed to achieve the new strategy’s goals.

Identify key national goals for education

After strategic issues have been identified, a small set of high-level goals will need to be established. Internationally, countries use national goals and targets to give visibility to national priorities and direct the education system towards their achievement. The goals should be specific and balanced, considering both the outcomes a system wants to achieve, as well as the internal processes and capacity throughout the system required to achieve these outcomes (Kaplan and Norton, 1992[8]). In turn, the goals should be associated with measurable indicators and achievable targets that are clearly reflected in the new strategy’s action plan and monitoring framework (see Recommendation 5.1.2).

Considering the challenges Serbia faces in terms of improving its education system, this review strongly recommends the government establish goals to raise learning outcomes and improve educational equity. This would help to ensure that the education system and society in general recognise these as national and urgent priorities. For example, the goal of improving student learning outcomes might be measured by the new national assessment once it is fully implemented. In the meantime, Serbia could use data from international assessments, such as PISA, to monitor student performance and measure progress towards this goal. Reducing the share of low performers in PISA to below 15% by 2020, in line with the European Union (EU) target (European Commission, n.d.[18]), would serve as a good national target for this indicator. The government can also consider setting interim benchmarks to ensure that the country is progressing towards the long-term goal.
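Monitoring a target of this kind amounts to computing the share of students below a performance threshold and comparing it with the benchmark. The short sketch below shows the calculation; the scores are invented and the cut-off is a stand-in for the official PISA "below Level 2" thresholds, which differ by domain.

```python
# Illustrative only: share of low performers in a small, invented sample of scores.
scores = [512, 388, 430, 295, 467, 401, 523, 376, 449, 418]
low_performance_cutoff = 420   # assumed illustrative threshold, not an official value

share_low = sum(score < low_performance_cutoff for score in scores) / len(scores)
print(f"Share of low performers: {share_low:.0%}")   # 50% in this invented sample

# Progress would then be judged against the target discussed in the text (below 15%).
print(f"Meets the sub-15% target: {share_low < 0.15}")
```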

Undertake a national consultation to develop the new strategy

The 2020 strategy was developed in consultation with key stakeholders in the sector and informed by analysis from a large group of education experts. Continuing this practice will help raise the profile of the new strategy and build stakeholder buy-in for the newly established educational goals. To ensure the consultation process is efficient, the ministry should lead the strategy’s overall development but manage the consultation process in a way that is both inclusive and effective.

This should involve forming a representative stakeholder group that includes key actors from across the system, such as ministry officials and staff from technical bodies (the IEQE and IIE), as well as actors who may not have been included in consultations on the current education strategy, such as parents and students. A wide range of actors should be invited to provide direct feedback and suggest proposals to be included in the new strategy. The process might be time-bound (e.g. three to six months) to keep development of the strategy on track. This is important since long consultation processes may lead to stakeholder fatigue. However, public consultations should not end once the new strategy is adopted. In 2018, the EU found that Serbia had few public consultations on education and training regulations (European Commission, 2018[15]). Maintaining stakeholder engagement throughout the legislative development process and clearly communicating progress towards headline goals and targets can advance the implementation of the strategy and support accountability.

Recommendation 5.1.2. Develop action plans and a monitoring framework with measurable targets

Once Serbia has prioritised a set of strategic issues and identified clear national goals for education, it will be important to operationalise these goals through concrete actions and specific, measurable targets. The current education strategy includes a multiplicity of goals and some quantitative targets (see Box 5.1). For example, by 2020, the strategy aims to increase public funding for education to 6% of GDP, reduce the drop-out rate to 5% and have 50% of students who graduate from university continue their studies at the graduate level. However, these targets are not reflected in the action plans. Aligning the activities in the action plan with clear goals and measurable targets would help stakeholders to better understand what they are working towards and direct change. It would also help monitor the implementation process and communicate progress more effectively to promote greater transparency and accountability.

Create new action plans with specific actions and measurable outcomes

To make Serbia’s new education strategy more operational, the ministry should focus on specific actions with measurable outcomes. The action of “evaluating educational achievements of primary students”, for example, is measured by progress indicators including: the number and types of student educational achievements, results on educational achievements and the number of programmes for the promotion of teacher competencies (in the areas of student assessment) (MoESTD, 2015[19]). Some of these indicators (such as the number of teacher education programmes) may not be the most relevant measures for evaluating student achievement. Tables 5.2 and 5.3 provide examples of action points from Serbia’s current plan and suggest ways in which these could be improved for the action plans associated with the new strategy. In developing substance for the new action plans, it will be important for the ministry to consider the following points:

  • Align actions with clear and specific goals. Some of the actions listed in the current strategy could serve as system goals, such as “reduction in drop-out rate during primary education”; however, others are less clear, such as “elaborating all the components of continuous teacher development and advancement”. While the former action plainly indicates what goal is trying to be achieved (lowering the drop-out rate), the latter does not as the desired outcome is not explicitly stated. Serbia’s new action plans should align actions with clear and specific goals so that actors know what they are working towards (the outcome). Desired outcomes could also be clearly stated and included in action plans.

  • Ensure actions are clear and specific. Similar to goals, actions and sub-actions should be operationally clear and specific. For example, one of the implementation activities for “elaborating components of teacher development” includes establishing “a fair, performance-based system of teacher evaluation”. However, this could be unpacked further to outline what specific steps are required to establish such a system. For instance, developing the tools and guidelines to build the capacity of advisors to undertake teacher appraisals is an example of a more specific action point that could be included in the new strategy to better support implementation.

  • Include an indication of timing and points of contact. Serbia already includes a timeline and points of contact for each action. This practice should be continued in the next strategy as it can help keep the implementation process on schedule and hold designated stakeholders accountable for specific actions. The ministry could also consider developing mid-term outcomes or milestones for the next strategy in order to monitor progress continuously. For example, a mid-term outcome of building the capacity to conduct teacher appraisal could be that advisors understand what makes for an effective appraisal and where they can receive further support.

  • Review progress indicators and assign clear targets. While the 2020 strategy has some clear targets, these should be reflected in the action plans to help track progress towards the national education goals. Serbia could also add indicators related to the types of processes and capacities that need to be developed to achieve national goals, defining what success would look like for stakeholders (i.e. outcomes). For example, the current plans make no mention of the need to build the capacity of advisors to carry out teacher appraisals; this is, however, an important progress indicator that could be assigned clear targets.

  • Identify and plan for resource needs. For the action plan to be financially viable, the issues addressed must be sufficiently important, produce desirable results at a reasonable cost and have stability (Bryson, 2018[23]). This requires a constructive discussion with the Ministry of Finance, which should exert pressure on the education ministry to develop a realistic budget that prioritises actions and measures results. Decisions should align with the government’s broader national development agenda and adequate resources should be allocated with more predictability based on strategic plans.

Table 5.2. Examples of items from Serbia’s current action plans

Action: Development of system of final exam in secondary education: comprehensive, artistic and vocational final exams

  • Instruments for implementation of the action: drafting laws and adopting bylaws; developing the final exam model; establishing connection with higher education in the process of preparing and implementing the matriculation exam; developing the system of baccalaureate quality monitoring; developing the map of baccalaureate introduction and result application.

  • Outcome (result of action): uniform system of taking all established final exams and beginning the implementation of that system.

  • Progress indicators: number and quality of designed instruments; number and quality of performed tests, quality of analyses, change of educational practice; number of reviews.

  • Timeline: February 2015 to June 2019.

  • Responsible agencies and partners: Ministry, IEQE.

Action: Elaborating all the components of continuous teacher development and advancement

  • Instruments for implementation of the action: drafting laws and adopting bylaws; establishing a fair, performance-based system of teacher evaluation; establishing sustainable funding models for teacher advancement; producing analyses of the effects of teacher advancement; revising criteria for acquiring teacher certification to provide continued quality of teachers’ work (possibility of losing the title).

  • Outcomes (results of action): better teacher quality by reinforcing teachers’ motivation for professional development; a more efficient teacher advancement system providing better quality of teaching; harmonised teacher development and teacher advancement components.

  • Progress indicators: number of defined indicators of teacher quality; database of teachers with titles; percentage of teachers who have advanced to specific titles.

  • Timeline: February 2015 to December 2017.

  • Responsible agencies and partners: Ministry, IEQE, National Education Council.

Source: MoESTD (2015[19]), Action Plan for Implementation of the Strategy for Education Development in Serbia 2020, Ministry of Education, Science and Technological Development.

Table 5.3. Proposal for items to be included in Serbia’s new action plan

(Proposed columns: goals; actions/sub-actions; timeline; lead agency/partner; mid-term outcomes; outcome.)

Goal: Implement the Matura at the end of upper secondary
Outcome: New Matura is taken by all students at the end of upper secondary education. Results determine university placements.

  • Determine the responsible body(ies) for key administrative tasks (2019). Mid-term outcome: key administrative responsibilities are clear; body(ies) have adequate resources to undertake their role.

  • Develop examination syllabi and example test materials (2019-20). Mid-term outcome: examination syllabi and example test materials reflect the curriculum’s learning objectives.

  • Develop a Common Admissions System (CAS) for higher education (HE) placements (2019-21). Mid-term outcome: CAS system is fully developed; universities have confidence in it.

  • Pilot the new Matura and review design (2021-22). Mid-term outcome: the pilot covers a representative student sample; modifications to the Matura model are made based on an evaluation of the pilot.

  • Prepare schools and students for the new Matura (2022-23). Mid-term outcome: all schools have received training/materials; schools and students understand how the new Matura will operate and know which body they can direct questions to.

  • Implement new Matura (2023). Mid-term outcome: all eligible students take the Matura in 2023; the vast majority (xx%) of university places are determined by Matura results.

Goal: Strengthen support and incentives for teachers’ promotion
Outcome: Teachers pursue promotion to higher levels.

  • Revise teacher standards and define competencies needed to move up levels and how to acquire them (2019-20). Mid-term outcome: new standards clearly set out required competencies to move up to new levels; teachers are engaged in the development of the new standards and support them.

  • Provide teachers with guidance and mentorship on how to select professional development opportunities that will help them move up the career path (2019-20). Mid-term outcome: teachers receive guidance and mentorship when selecting professional development; they know who to ask for further information.

  • Develop education advisors’ capacity to undertake appraisals, including guidelines and tools (2019-20). Mid-term outcome: education advisors understand what makes for an effective appraisal, and where they can receive further support.

  • Link the career path to the teaching salary scale (2019-20). Mid-term outcome: teachers’ salaries increase in line with international and regional practices.

  • Communicate changes to promotion to teachers and schools (2020-22). Mid-term outcome: teachers and schools understand changes to the promotion system.

  • Progressively implement the new promotion system (2022). Mid-term outcomes: xx% of existing teachers pursue promotion annually; most teachers understand and support the new promotion system.

Recommendation 5.1.3. Monitor progress to build accountability for achieving education goals

System monitoring has an accountability function, which determines if goals are being reached, and a learning function, which determines if defined strategies and policies remain relevant in the current environment. It is not a stand-alone process but part of an ongoing cycle (Bryson, Berry and Kaifeng Yang, 2010[24]; George and Desmidt, 2014[25]). Without a means to monitor the system continuously, countries risk producing an abundance of potentially out-of-date information that is not relevant for policymaking. System monitoring should not be an isolated technical process but should create pressure for the government and education system to demonstrate progress. One of the key reasons that Serbia’s 2020 strategy has not fully achieved its objectives is the lack of outreach to raise awareness among policymakers and the public about progress towards achieving educational goals.

Strengthen the role of the special working group to monitor the strategy

In 2018, Serbia established a special working group within the ministry to monitor the implementation of the strategy and action plan. To date, the group has published only one progress report, which highlighted the need for better education statistics. To maintain the impetus for system improvement, hold the government accountable for progress and ensure alignment across different policy areas, the ministry should strengthen the role of the special working group in monitoring the new education strategy and action plans.

One way to strengthen the role of the working group is to ensure the ministry’s leadership is personally invested in the strategy’s progress and raise the group’s prominence within the ministry. This could be achieved by having the minister lead the working group. Key representatives from each unit within the ministry, including officials from the National Education Council, the IIE and the IEQE, should also be invited to participate in the group to support comprehensive system evaluation.

Another way to strengthen the role of the working group is to organise regular (e.g. monthly) meetings to discuss progress and identify important challenges. These discussions do not need to be technical but should focus on taking stock of which actions have been completed and where progress is stalled. The technical research to inform these discussions should be carried out by the analytics group that this review recommends be re-established (see Recommendation 5.2.2), which could serve as a secretariat for this body. For example, the special working group could request the analytics group to produce a national report on progress towards achieving the strategy and undertake other specialised research. A summary of the discussions at these meetings could be published on a regular basis (e.g. quarterly) to keep the public informed of progress and success.

Develop platforms for regular reporting on progress

Most OECD countries regularly publish an analytical report on education (OECD, 2013[2]). National policy goals and priorities guide the content of this report. Typically, such reports describe progress against the targets of the national indicator framework and explain the strengths and challenges of the system by studying related inputs, processes, outputs and outcomes. For example, an analytical report might first describe the overall performance of students on a national assessment and examine this performance in relation to changes in school resource allocation and efforts to improve teacher assessment literacy. The report might also discuss future policies or activities intended to address certain challenges.

Serbia has produced only one analytical report that took stock of progress towards achieving the current education strategy and, with the exception of EU funding commitments, there is no expectation or timeframe for reporting on a regular basis during the strategy’s implementation. This makes it difficult for policymakers to make informed decisions and impedes the national debate on education. Serbia should establish a regular reporting timeframe for progress towards achieving the education strategy. The ministry could aim to publish such a report every two years, and later on an annual basis, which would provide more stability than reporting intermittently or only at the end of the strategy. This report should be the responsibility of the ministry’s analytics group but, if capacity is an issue, it could be undertaken by external researchers. The reporting timeframe should also be accompanied by a dedicated budget, agreed upon by the government.

In addition to creating a regular analytical report on education, the Serbian ministry could develop other platforms to report on progress and success. For example, a performance dashboard could be added to the ministry’s website so that users can not only access an electronic copy of the strategy and action plan but also see visual representations of progress towards selected indicators included in the national indicator framework (see Recommendation 5.2.1). Instead of developing a separate electronic database, the ministry could link the dashboard directly to the UISE through the open data website. This would ensure the dashboard always displays the most recent information to users without the need to wait for a report to be published (Eckerson, 2011[26]). Box 5.2 describes some of the procedures and tools that New Zealand and the United States use to provide regular, up-to-date information about the performance of their education systems. These efforts would support Serbia in communicating information about the education sector more effectively.
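
As an illustration of how such a dashboard could stay current, the minimal sketch below assumes a helper that, in a real deployment, would query the UISE open data service; here it is stubbed with an invented value. The drop-out target of 5% is taken from the current strategy for illustration; the indicator name and latest value are assumptions.

```python
# Sketch of a dashboard refresh step. fetch_latest_value() is a stub standing
# in for a call to the UISE/open data service; the returned figure is invented.
def fetch_latest_value(indicator: str) -> float:
    sample_values = {"drop_out_rate": 6.2}  # illustrative value only
    return sample_values[indicator]

TARGETS = {"drop_out_rate": 5.0}  # % target taken from the 2020 strategy

for indicator, target in TARGETS.items():
    latest = fetch_latest_value(indicator)
    status = "on track" if latest <= target else "behind target"
    print(f"{indicator}: latest {latest}% vs target {target}% ({status})")
```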

Box 5.2. Examples from New Zealand and the United States on providing regular up-to-date information about progress in education

In New Zealand, Education Counts is an online platform managed by the Ministry of Education that was built to increase the availability and accessibility of education data in the country. It provides a range of information, such as achievement and participation data, and allows users to filter by level of education and demographic background. The platform also provides tools such as Know Your Region, where it is possible to select a particular regional council or territorial authority and access data such as student attainment, student population, or student engagement specific to that area.

In the United States, the National Center for Education Statistics (NCES) is the country’s primary federal entity for collecting and analysing education data. The NCES provides current information about the American education system through its online database, allowing users to access information about the state of education from pre-school to the post-secondary level. The NCES also publishes an annual report that shows progress on key indicators, such as drop-out rates. The website and annual report help summarise important developments, progress and trends based on the latest national statistics, which are updated throughout the year as new data become available.

Sources: NCES (2019[27]), The Condition of Education, https://nces.ed.gov/programs/coe (accessed on 26 August 2019); Ministry of Education (2019[28]), Education Counts, https://www.educationcounts.govt.nz/home (accessed on 26 August 2019).

Policy issue 5.2. Enhancing the availability and use of evidence for accountability and policymaking

Data is integral to system accountability and, as such, the ministry must ensure that the Unified Information System of Education (UISE) has the capacity to support a wide range of evaluation efforts. As a priority, regulations and processes around data collection and access should be standardised. While Serbia has attempted to establish a national strategy on education statistics between the ministry and the SORS, this has not been realised, leaving the country without a central, unified source of education data. Strengthening administrative data in the UISE will not only provide a valuable source of information to inform policymaking, it can also help drive improvements and ensure more efficient spending on education. In addition to increasing the availability of education data, Serbia should ensure that relevant information can be extracted and easily used. Without greater functionality, Serbia’s UISE will struggle to generate a stronger national understanding of the challenges and progress of the education sector.

Recommendation 5.2.1. Strengthen foundations for effective data collection and storage

High-quality and accessible data is integral to system evaluation and accountability. Currently, the parallel processes for data collection prevent Serbia from developing a unified source of reliable information about the education system and create a reporting burden for schools. Developing a national indicator framework could help Serbia measure and communicate progress towards national education goals. It would also serve as a basis for conducting a systematic mapping exercise of available, problematic and missing education indicators across various databases. To do this, Serbia will need to develop a formal data dictionary and sharing protocol to help improve the quality of education data and encourage actors to rely on the UISE for desired information. Finally, the ministry should consider using civil identification numbers with appropriate data security measures instead of separate student identifiers to maximise the analytical potential and policy relevance of education data.

Establish a national indicator framework to measure progress

A national indicator framework specifies not only the measurable targets associated with goals but also the data sources that will be used to measure progress and the frequency of reporting on each indicator. Without this component, system evaluation lacks co-ordination around which data points to monitor, resulting in fragmented goal-setting and a loss of strategic direction. In 2011, Serbia’s National Education Council proposed a set of indicators to help monitor the education system; however, this document is not currently used and some of the progress indicators in Serbia’s 2020 Education Strategy are vague. For example, the action to “strengthen the educational function of primary school” is measured by progress indicators such as best practices and models of work prepared (MoESTD, 2015[19]).
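
To make this concrete, the sketch below shows one possible way of structuring framework entries. It is a minimal illustration only: the 5% drop-out target comes from the current strategy, while the second indicator, its target and the reporting cycles are assumptions chosen for the example.

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    """One entry in a hypothetical national indicator framework."""
    name: str             # what is measured
    goal: str             # national goal the indicator supports
    target: float         # measurable target value
    unit: str             # unit of the target, e.g. percentage
    data_source: str      # database or assessment used to measure progress
    reporting_cycle: str  # how often progress is reported

# Illustrative entries only; the real indicators, targets and sources would be
# defined by the ministry when the framework is developed.
framework = [
    Indicator("Drop-out rate, basic education",
              "Increase completion of basic education",
              5.0, "%", "UISE", "annual"),
    Indicator("Students below minimum standard, Grade 6 mathematics",
              "Improve foundational learning outcomes",
              20.0, "%", "National assessment (Grade 6)", "every two years"),
]

for ind in framework:
    print(f"{ind.name}: target {ind.target}{ind.unit} "
          f"[source: {ind.data_source}, reported {ind.reporting_cycle}]")
```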

The lack of clear and measurable indicators inhibits the reporting and monitoring of system progress. As such, Serbia should review existing education indicators across various databases and develop a clear indicator framework to support the next education strategy. This could build on the proposed framework developed by the National Education Council but should be updated to include new data sources, such as the national assessment. This would support public accountability vis-à-vis national goals and identify data gaps to orient the future development of Serbia’s UISE. Box 5.3 shows how Ireland included specific indicators in its Action Plan for Education 2018 to measure progress toward national goals for education.

Box 5.3. Example of Ireland’s indicator framework for the Action Plan for Education 2018

Ireland’s Action Plan for Education 2018 accompanies the country’s national education strategy 2016-19, setting out priorities and actions that the Department of Education and Skills and its technical agencies should undertake during the year. The action plan clearly aligns each action and sub-action to the country’s five main goals for improving the quality of its education system. Each goal is associated with a list of actions and a set of indicators that are used to measure progress. The first goal, “improve the learning experience and the success of learners”, identifies six objectives, followed by indicators, including for example:

Objective 1.2: Deliver a “step change” in the development of critical skills, knowledge and competencies to provide the foundations for participation in work and society. Indicators:

  • Increase the percentage of students taking higher-level maths at the end of Junior Cycle: 60% by 2020

  • Increase the proportion of students performing at Level 5 or above for reading in PISA: 12% by 2020

  • Decrease the proportion of students performing below Level 2 for science in PISA: less than 10% by 2025

  • Increase the proportion of students performing at Level 5 or above for mathematics in PISA: 13% by 2020

Objective 1.6: Enable learners to communicate effectively and improve their standards of competency in languages. Indicators:

  • Percentage of candidates presenting a foreign language at the Junior Certificate/Cycle Examination: 100% by 2026, 92% by 2022

  • Students studying a foreign language as part of their HE course: support 20% of all higher education (HE) students to study a foreign language as part of their course (2026)

  • Students doing Erasmus+: 4 100 HE students (2018/19)

Note: Junior Cycle in Ireland covers the first three years of secondary school. Starting age is around 12 or 13 years old. The Junior Cycle Examination takes place at the end of Junior Cycle in post-primary schools.

Source: Department of Education and Skills (2018[29]), Action Plan for Education 2018, http://www.education.ie (accessed on 9 August 2019).

Harmonise data collection by establishing clear definitions and protocols

While a national indicator framework can help orient reform efforts, Serbia will still need clear and harmonised protocols regarding the definition of indicators and data points across the education databases. Currently, education data is managed in parallel by the ministry (UISE) and the SORS. Moreover, while the Dositej platform aims to streamline the data collection process, there are no common data standards to ensure that all schools have a shared understanding of data definitions. The result is an increased risk that indicators or data points are reported in different ways, preventing Serbia from establishing a central source of reliable data about the education system. A formal data dictionary and sharing protocol would guide schools and actors within the SORS and the ministry on how to define data, preferably in line with international standards, and encourage both government and external requestors to turn to the UISE for the information they need.
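
As a purely illustrative sketch of what a shared data dictionary could look like in practice (the field names, codes and validation rules below are assumptions, not existing UISE definitions), each field would carry one agreed definition and a rule that both the ministry and the SORS apply before accepting school-reported data:

```python
# Illustrative data dictionary entries; the actual fields, definitions and
# allowed values would be agreed between the ministry and the SORS.
DATA_DICTIONARY = {
    "enrolment_status": {
        "definition": "Student's enrolment status on the reference date",
        "allowed_values": {"enrolled", "transferred", "dropped_out"},
    },
    "grade_level": {
        "definition": "Grade the student attends in the current school year",
        "allowed_values": set(range(1, 13)),
    },
}

def validate_record(record: dict) -> list:
    """Return a list of problems found in a single school-reported record."""
    problems = []
    for field, spec in DATA_DICTIONARY.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif record[field] not in spec["allowed_values"]:
            problems.append(f"invalid value for {field}: {record[field]!r}")
    return problems

print(validate_record({"enrolment_status": "enroled", "grade_level": 6}))
# -> ["invalid value for enrolment_status: 'enroled'"]
```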

Many countries have established strict protocols regarding the definition of data points and who can retrieve information from schools. For example, to ensure consistency for national-level reporting and analysis across individual states, the United States Department of Education has created the Common Education Data Standards, which define common education data elements across the country (Department of Education, n.d.[30]). By implementing common data standards, national education policymakers can be confident that data from different states have the same meaning and can be relied upon to inform federal decision-making. Moreover, the United States also regulates who can collect data from schools. For example, if government parties wish to contact schools to collect information, they must undergo a rigorous screening process that is regulated by data sharing legislation (U.S. Department of Education, 2018[31]). These procedures help restrict outside access to school information, funnel data retrieval to the education database and limit direct collection from schools to data that cannot be found in the EMIS (e.g. interviews with teachers or students).

Develop processes to identify data gaps

High-quality data and indicators are crucial for making informed policy decisions. In Serbia, education statistics are not sufficiently reliable and present a major challenge to system evaluation (MoESTD, 2018[16]). Improving data quality and undertaking research to shed light on some of the “gaps” where data collection is too costly or not feasible are some of the ways in which the government can improve the quality of education data (OECD, 2013[2]). In particular, the national indicator framework recommended by this review should be used to conduct a systematic mapping of available, problematic and missing indicators. This could help the ministry identify data gaps and orient the future development of the UISE. If, for example, Serbia sets a goal to improve the retention of vulnerable groups of students, the national indicator framework would indicate that the UISE is the data source to be used to monitor this indicator. The UISE would then need to collect data about, for example, students’ demographic or socio-economic profile and other measures of vulnerability. A lack of available indicators to measure progress towards this goal would signal to UISE staff that they should prioritise developing the capacity and data collection procedures to support the indicator.
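
A mapping of this kind lends itself to a simple, repeatable check once the indicator framework exists. The sketch below is hypothetical: the indicator names and UISE field names are invented for illustration.

```python
# Hypothetical indicators from the framework, each with the fields it needs,
# compared against the fields assumed to exist in the UISE today.
required_fields = {
    "drop_out_rate": {"enrolment_status", "grade_level"},
    "retention_of_vulnerable_students": {"enrolment_status", "socio_economic_index"},
}
available_uise_fields = {"enrolment_status", "grade_level"}

for indicator, needed in required_fields.items():
    missing = needed - available_uise_fields
    status = "available" if not missing else f"missing fields: {sorted(missing)}"
    print(f"{indicator}: {status}")
# drop_out_rate: available
# retention_of_vulnerable_students: missing fields: ['socio_economic_index']
```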

Link education data to data stored by other agencies

The ministry’s plan to introduce a unique identifier that will follow individuals throughout their educational trajectory is a noteworthy innovation for Serbia’s UISE. This will allow for integrated analysis of the education system, for example by producing the information needed to calculate real drop-out rates and analysing the relationship between student-teacher ratios and assessment results. However, the current design of the unique identifier limits the analytical potential of Serbia’s education data since it will not link education data to other government databases. This contrasts with most modern EMIS systems, which use the national/civil identification number of students rather than creating separate student identifiers (Abdul-Hamid, 2014[11]).

There are several advantages to using civil identification numbers. First, these numbers are inherently standardised and therefore follow a standard structure across all education databases, including vocational education and training and higher education. Moreover, because they exist nationally, civil identification numbers can be used to research different sectors (e.g. if one wishes to study education outcomes and labour market success). Finally, by using this identifier, much student information can be imported automatically into the UISE by linking the system with the national registry, which greatly improves data quality and reduces the data entry burden on schools. Of course, managing civil identification numbers should be done carefully, with strict protocols about who can access data, how they can access and use it and when data should be anonymised to protect student privacy.
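
One widely used safeguard, sketched below under the assumption that researchers only ever receive pseudonymised extracts, is to replace the civil identification number with a keyed hash before data leave the UISE. The key and example identifier are placeholders.

```python
import hashlib
import hmac

# The key would be held securely by the UISE administrator and never shared.
SECRET_KEY = b"replace-with-a-securely-stored-key"

def pseudonymise(civil_id: str) -> str:
    """Return a stable pseudonym for a civil identification number.

    The same civil_id always maps to the same pseudonym, so extracts from
    different education databases can still be joined, but the original
    number cannot be recovered without the key.
    """
    return hmac.new(SECRET_KEY, civil_id.encode("utf-8"), hashlib.sha256).hexdigest()

print(pseudonymise("0101990123456"))  # placeholder identifier
```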

Recommendation 5.2.2. Support the use of data and evidence in policymaking

To strengthen the use of data and evidence in policymaking, Serbia needs to build the capacity of technical staff and key actors across the system. This primarily involves re-establishing the ministry’s analytics group, which was created in 2014 to collect and analyse education data and policies but is no longer operational because of significant fluctuations in staff numbers. It also involves strengthening the IEQE and drawing on the wider research community to undertake analysis and conduct evaluations that can inform policymaking. Without stronger capacity, using data and evidence to inform policies will likely remain a challenge for Serbia (MoESTD, 2018[17]).

Re-establish the analytics group in the ministry

Using data and evidence in policymaking requires having enough people with the right skills to support system evaluation. For example, these individuals should conduct regular policy evaluations that consider past and international experiences. While Serbia’s IEQE already has the capacity to assume some of these responsibilities, its staff members are not directly involved in the policymaking process. To bridge this gap, Serbia should re-establish the ministry’s analytics group with a mandate to feed data and evidence into the special working group responsible for monitoring the education strategy. The group could also be tasked with managing the UISE and reviewing and implementing some of the recommendations presented in this review. For example, it might introduce standardised data definitions and protocols or develop a national report on the performance of the education system.

This will require additional staff capacity and resources since the ministry will likely need more than three individuals and a range of profiles, including statisticians and people with experience in research and policy analysis to help make sense of the data and provide recommendations for policy. In Georgia, for example, the EMIS employs five statisticians solely for responding to data and research requests, in addition to department leadership, administrative support and software developers who manage the system.

Strengthen the IEQE’s capacity and resources

While the appointment process for senior management of the IEQE is in line with the practices of OECD countries, the institute is operating with a limited budget and insufficient staff with the right technical expertise. This makes it difficult for the institute to fulfil its broad mandate of supporting evaluation and assessment in Serbia’s education system. As the IEQE’s list of responsibilities continues to grow (it is now responsible for all national and international student assessments and exams), the government should strengthen the institute’s capacity and resources to ensure its effective operation. The current public sector hiring freeze is hindering the IEQE’s ability to address its staffing deficit. Until this freeze is lifted, one possible way for Serbia to address resource issues is to agree a multi-year activity programme and related budget. Currently, the institute’s budget is planned on a three-year basis but approved annually, making it difficult to sustain important research and evaluation activities over the long term and to hire external consultants to support the IEQE in fulfilling its mandate.

Make greater use of the research community for policymaking

Serbia has a strong research community that produces extensive evidence about the education system which feeds into the policymaking process. For example, independent researchers developed the new Matura proposal and have provided national analysis of PISA data. The Serbian ministry also funds education research and has tested various mechanisms to support research activities, for example by seconding staff between education authorities and academia and by organising conferences. Other countries support their research communities in similar ways, for example by funding a university department to create a platform for sharing calls for tenders and posting completed research. However, there is a lack of alignment between research projects and the needs of Serbian policymakers. This could be improved by making the analytics group responsible for commissioning, publishing and hosting education research. Box 5.4 provides an example of how the research arm of the United States’ Department of Education organises research activities to guide and inform policy.

Box 5.4. Ways to encourage and support the research community

In the United States, the Institute of Education Sciences (IES) is the statistics, research and evaluation arm of the U.S. Department of Education. The IES is responsible for providing evidence to guide educational practice and policy. Under its Education Research Grants Program, the IES has established 13 programmes of research on different topics regarding the education sector. With applications accepted once a year, topics range from “Early Learning Programs and Policies” to “Improving Education Systems”. Eligible applicants include but are not limited to public and private agencies and institutions, such as colleges and universities, and non-profit and for-profit organisations.

Source: IES (2019[32]), Education Research Grants Program, https://ies.ed.gov/funding/ncer_progs.asp (accessed on 26 July 2019).

Recommendation 5.2.3. Improve the functionality of UISE to make data more accessible

One reason why data from Serbia’s UISE is not used more widely is that its functionality is limited to data entry and storage. Effective EMIS systems also have strong analysis and reporting functionalities (Villanueva, 2003[33]). These features should be available to all interested parties, since this can encourage the public to consult the UISE as the central source of information about the Serbian education system. Improving the functionality of the UISE can also help Serbia communicate proactively about the performance of the education system.

Disseminate data more effectively to inform education actors and society

Real-time access to data through a public web portal (accessible by anyone, not just those with ministry credentials) is a common international method of extracting information from EMIS databases and presenting it in an accessible manner. At the most fundamental level, users would be able to see how many students attend a school and how they perform on a national assessment. More sophisticated systems, such as EdStats in the United States, aid external research and analysis by facilitating comparisons across schools, aggregating data at different levels (e.g. regional or national) and providing a set of data visualisation tools (Abdul-Hamid, Mintz and Saraogi, 2017[34]). Serbia’s DevInfo website, which is managed by the SORS, provides public users with an interface to explore a limited amount of education data. The ministry could build on this example by creating an online platform that is easy to use and draws on select data from the ministry’s UISE. The platform should contain reporting features to create dynamically generated charts and figures and to export data for further analysis. Parents and students could use the portal to make important decisions and help hold the system accountable. Researchers would be able to use this portal to study the education system and contribute to system evaluation efforts. Insights from this tool could help encourage greater use of data to monitor educational progress and inform a national debate on education.

Help schools to make greater use of data

In addition to making education data more accessible to the public, Serbia should support schools in making greater use of data. Building on the recent development of the eClass Register pilot project, the ministry should explore the potential for expanding this into an open data portal for schools. This portal would link to the UISE, making real-time administrative and learning outcome data accessible in a user-friendly format to a wider range of education actors. The portal should not only allow schools to input data (e.g. attendance) but also to export it. For example, a principal might want to know the attendance rate of students by grade level. The portal could include a reporting feature that allows the principal to generate a two-column table in which the first column lists grade levels and the second indicates the attendance rate of students in that grade (a minimal sketch of such a report is provided at the end of this section). This type of advanced functionality would allow education data to be filtered by time period and displayed as graphical charts. Every time a report is “run”, the system would populate the defined objects with the most recent data (Abdul-Hamid, 2014[11]). Other types of data that could be accessible in this portal are:

  • Student profile. This might include information disaggregated by gender, mother tongue and socioeconomic background (in the future).

  • School context. Data could be filtered according to where a school is located (rural or urban), teacher-student ratio, etc.

  • Outcomes. This might include drop-out rates or learning outcomes (taking care to avoid the creation of league tables or other test-based accountability structures that can have negative consequences). Census data from the Grade 2 national assessment recommended by this review (see Recommendation 5.3.1) should only be available to schools but results could be aggregated by Regional School Authority (RSA) and shared publicly.

The portal should include a function that allows users to make contextualised comparisons of outcomes across schools operating in similar contexts or groups of students with similar profiles.
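
As a minimal sketch of the attendance report described above, assuming the portal can export records with hypothetical column names such as those below:

```python
import pandas as pd

# Illustrative attendance records as a school might export them; the column
# names are assumptions, not actual UISE or eClass Register fields.
records = pd.DataFrame({
    "grade":        [1, 1, 2, 2, 2, 3],
    "days_present": [180, 172, 175, 168, 181, 160],
    "school_days":  [185, 185, 185, 185, 185, 185],
})

# Two-column report: grade level and the average attendance rate in that grade.
report = (
    records.assign(attendance_rate=records["days_present"] / records["school_days"])
           .groupby("grade", as_index=False)["attendance_rate"]
           .mean()
)
print(report.round(3))
```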

Policy issue 5.3. Developing the national assessment to support system goals

National assessments that provide regular and reliable data on student learning outcomes can inform education policy, support strategic planning and help drive system improvement (OECD, 2013[2]). Results from these assessments can also be used to better understand how students are performing and serve as a reference for teachers’ classroom marking. In Serbia, system evaluation relies on periodic international assessments and the final exam of compulsory schooling to provide information on student learning. However, international assessments do not allow for comparisons at the local level (across RSAs) and are not specific to the Serbian context. For example, a large-scale international assessment may not test competencies that are included in the Serbian curriculum, such as transversal skills. Moreover, the final exam of compulsory education is not fully standardised and assesses a relatively limited range of competencies (see Policy issue 2.3). As a result, timely, reliable information about the extent to which students are meeting national learning standards is very limited.

While Serbia’s new Matura exam will provide an additional source of information about student learning, it will not address the gap in data on learning outcomes for earlier years of schooling. To address this, the ministry introduced a pilot national assessment in 2017/18. The pilot was developed centrally by the IEQE and consisted of a sample-based assessment for Grades 7 and 11. Results will be available in 2019 and discussions are currently underway about using these findings to establish a new national assessment system. However, there is no clear mandate to develop this tool in the country’s education strategy and action plans. As such, despite some plans regarding the design of the new assessment (its frequency, the grades and subjects to be assessed, etc.), no official decisions have been made. There are also no plans for financing the new assessment. This is a concern since the lack of an adequate budget is one of the reasons Serbia has not administered a national assessment since 2006 (World Bank, 2012[20]). This review provides suggestions on how Serbia could advance the development of the national assessment and establish it as a key instrument to support system goals for learning and equity.

Recommendation 5.3.1. Consider the design options to align the national assessment with its stated purpose

The main purposes of a national assessment in most EU and OECD countries are to support system monitoring, provide formative information about learning and serve as an accountability tool (OECD, 2013[2]). National assessments can serve one or a combination of these purposes. Currently, Serbia aims to design its new national assessment for the primary purpose of system monitoring. However, the national assessment could also provide information on other issues where the ministry would like to have more data. For example, it could help monitor the transition of students from class to subject-based teaching, the implementation of the new curriculum or the quality of teachers’ classroom assessments.

The stated purpose of a national assessment closely shapes its design and implementation. As such, the following section provides recommendations on how Serbia could build on the pilot assessment to design a national assessment system that fulfils the stated purpose of system monitoring while supporting broader education policy goals. The analysis is guided by a set of key considerations, outlined in Table 5.4, which any country needs to review when determining the design of a national assessment. This review suggests that Serbia create a steering group to lead the development of the national assessment (see Recommendation 5.3.3), which could be tasked with making decisions on these design questions. The review recommends the following options.

Table 5.4. Key decisions regarding national assessment

Subjects

  • Many. Advantages: broader coverage of skills assessed. Disadvantages: more expensive to develop; not all students might be prepared to take all subjects.

  • Few. Advantages: cheaper to develop; subjects are generalisable to a larger student population. Disadvantages: more limited coverage of skills assessed.

Target population

  • Sample. Advantages: cheaper and faster to implement. Disadvantages: results can only be produced at high, aggregate levels.

  • Census. Advantages: results can be produced for individual students and schools. Disadvantages: more expensive and slower to implement.

Grade level

  • Lower. Advantages: skills can be diagnosed and improved at an early stage of education. Disadvantages: the length of the assessment and the types of questions that can be asked are limited.

  • Upper. Advantages: more flexibility with respect to the length of the assessment and the types of questions that are asked. Disadvantages: skills cannot be evaluated until students are in later stages of education.

Scoring type

  • Criterion-referenced. Advantages: results are comparable across different administrations. Disadvantages: results require expertise to scale and are difficult to interpret.

  • Norm-referenced. Advantages: results are easier to scale and interpret. Disadvantages: results are only comparable within one administration of the assessment.

Item type

  • Closed-ended. Advantages: cheaper and faster to implement; items are more accurately marked. Disadvantages: can only measure a limited amount of skills.

  • Open-ended. Advantages: a broader set of skills can be measured. Disadvantages: more expensive and slower to implement; marking is more subjective in nature.

Testing mode

  • Paper. Advantages: the processes are already in place and the country is familiar with them; requires no additional capital investment. Disadvantages: results are produced more slowly; seen as more old-fashioned.

  • Computer. Advantages: results are produced more quickly; more cost-effective in the long term; seen as more modern. Disadvantages: new processes have to be developed and communicated; requires significant initial capital investment.

Sources: Adapted from DFID (2011[35]), “National and international assessment of student achievement: A DFID practice paper”, https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/67619/nat-int-assess-stdnt-ach.pdf (accessed on 13 July 2018); OECD (2011[36]), Education at a Glance 2011: OECD Indicators, https://doi.org/10.1787/eag-2011-en.

Implement national assessment in Grades 2 and 6, and consider Grade 10 in the future

Currently, the ministry plans to administer the national assessment in Grade 6. While this would fulfil the need for more data on learning outcomes at the lower secondary level, it leaves the country with little information about learning in the early primary grades. This is a concern, given that the consolidation of foundational cognitive skills in the first years of school is essential for future learning. For this reason, most OECD countries assess student learning in at least one grade of primary school. As such, this review recommends administering the national assessment in both Grades 2 and 6. If additional resources are available in the future and once the assessments in Grades 2 and 6 have been established, Serbia might consider administering a national assessment in Grade 10. This could allow for broader measurement of the curriculum by testing subjects that may not be covered by PISA or the new Matura.

  • Administer the national assessment for primary education in Grade 2.

    • Currently, Serbian teachers are required to administer school-based diagnostic tests at the beginning of each academic year. This review recommends that Serbia standardise the content of these assessments and establish mandatory initial tests for Grades 1 and 5 (see Chapter 2). This will provide comparable data about student learning at the start of the two cycles of basic education. However, since the marking of these tests will not be standardised, results cannot be used as a reliable source of information to monitor the first cycle of primary school.

    • The review team was informed that one of the reasons Serbia had not chosen to administer a national assessment in the early years of schooling was the strong performance of students in the TIMSS Grade 4 survey. However, there is a risk in relying on this one measure to form an opinion on learning in the critical early years of schooling, especially when this measure does not cover reading literacy. To further support system monitoring in the early years of primary, Serbia could consider conducting a national assessment in Grade 2. This would be administered to the full cohort of students in the second half of the school year, giving teachers an external reference point to moderate or benchmark their classroom assessments. The design, delivery and scoring procedures of the Grade 2 national assessment must be appropriate for very young learners.

    • Importantly, Serbia would need to ensure that these externally marked assessments are not interpreted as carrying summative stakes, which could have negative consequences. It is important to communicate that the assessment is for system monitoring and diagnostic purposes only. A national assessment in Grade 2 would give students one year to adjust to formal schooling but still help teachers identify learning needs early enough to address achievement gaps before they become problematic. This assessment would also provide valuable insights about student learning at a stage where the national perception of education quality is good.

  • Implement plans to administer the national assessment in Grade 6.

    • This review supports Serbia’s plans to administer the new national assessment in Grade 6. This would provide information on student learning one year after the transition into the second cycle of education (Grades 5 and 9), addressing the need for better data to understand how the transition from class-based to subject-based teaching impacts learning. It would also fill an information gap between Grades 5 and 8, stages which can respectively draw on available data from the new initial diagnostic test (Grade 5) and the final exam of compulsory education (Grade 8). Serbia might also consider introducing links between the Grade 4 TIMSS survey and the Grade 6 national assessment, both for the test instruments and the background questionnaires. This would allow for comparative analysis on important research questions, such as “Do Serbian students become less engaged in school after Grade 4?”.

  • Consider administering a national assessment for Grade 10 in the future.

    • When Serbia implements its new Matura exam, this will provide reliable data about student learning at the end of upper secondary. However, there will still be a gap in reliable data in the first years of secondary school. International assessments such as PISA can help fill this gap but do not provide information on the extent to which students are mastering the national curriculum. As such, this review recommends that once the Grade 2 and 6 assessments have been established, the steering committee consider administering a national assessment in Grade 10 should additional funding become available.

    • An assessment in Grade 10 could help develop test items for the Matura. The IEQE already used the recent pilot assessment to test new Matura items and could continue this practice to adjust the exam in the future. The Grade 10 assessment would also allow Serbia to measure broader competency areas that align with national priorities. For example, the subjects assessed in Grade 10 might include foreign languages, science, technology, engineering and mathematics (STEM) fields, digital competencies, social and civic competencies, entrepreneurship and intercultural skills, which are among the key competencies of Serbia’s education law (MoESTD, 2018[17]). The IEQE could alternate the subjects, assessing them in different years to reduce the cost of administering multiple assessments at the same time.

Maintain plans for sample-based assessment but consider census-based assessments in the future

To ensure the 2018 pilot national assessment was representative, the IEQE stratified the student sample by RSA. However, during the review mission, the ministry mentioned that the sample might be extended to provide analysis at the municipal or school level. This would require progressively more students and schools to participate in the assessment in order to maintain precise and reliable comparisons. It is not clear that sampling at the district level would provide added value beyond the existing sampling at the RSA level, since these units of analysis are not that different (there are 17 RSAs and 29 districts). Serbia would not be able to sample at the school level because the average class size per grade is too small. As such, to make school-level comparisons, the assessment would need to be census-based.
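
To illustrate the mechanics of RSA-level stratification, the sketch below allocates a fixed overall sample across RSAs in proportion to enrolment. The RSA names, enrolment figures and sample size are hypothetical.

```python
import math

# Hypothetical Grade 6 enrolment per Regional School Authority (RSA); real
# figures would come from the UISE.
enrolment_by_rsa = {"RSA_A": 12000, "RSA_B": 8000, "RSA_C": 5000}
total_sample = 3000  # overall sample size chosen for the assessment

total_enrolment = sum(enrolment_by_rsa.values())
allocation = {
    rsa: math.ceil(total_sample * enrolment / total_enrolment)
    for rsa, enrolment in enrolment_by_rsa.items()
}
print(allocation)  # {'RSA_A': 1440, 'RSA_B': 960, 'RSA_C': 600}
```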

This review recommends that Serbia maintain the current plans to stratify the sample by RSA for the Grade 6 assessment but make the Grade 2 assessment census-based once the instrument has been developed (see above and Table 5.5). If more resources are available in the future, Serbia could also consider making the Grade 6 assessment census-based. This would provide data that could be used formatively to improve teaching and learning within and across schools; however, this option would be considerably more expensive and require additional capacity to implement. Moreover, this review recommends that Serbia maintain a sample-based assessment in Grade 10, should this be developed in the future. This will help avoid the perception that the Grade 10 assessment has consequences for students at a time when they are starting to prepare for the Matura exam.

Develop a timetable to assess foundation skills in Grades 2 and 6

Serbia’s 2018 pilot national assessment tested students’ knowledge in mathematics, physics and history but the country only has guaranteed funding to develop tests for two subject areas. Focusing on a limited number of subjects is consistent with the national focus to relieve testing pressure on students and schools (see Chapter 2). It also creates space to include more questions within each subject to gain better insights into areas where students struggle to meet learning standards. As such, this review recommends that Serbia’s national assessments maintain the mathematics subject from the pilot but replace the physics and history test with an assessment of literacy in either the Serbian language or mother tongue. These subjects were assessed in Serbia’s previous national assessment, which was discontinued after 2006 (World Bank, 2012[20]). Reintroducing these subjects in the new national assessment of Grades 2 and 6 could help the Serbian education system strengthen the foundational skills of students.

The frequency with which countries assess mathematics and literacy varies. For example, among OECD countries with national assessments at the lower secondary level, around 60% test students in mathematics on an annual basis; in literacy, this share is 64% (OECD, 2015[12]). Other countries assess subjects on a rotating or alternating basis. To generate regular, predictable and timely information about learning outcomes for system monitoring, Serbia should develop a clear timetable setting out how often each subject will be assessed by the national assessments in Grades 2 and 6. Since annual testing is costly, Serbia could assess foundation skills in Grade 6 every 2 years but aim to administer the Grade 2 assessment annually once the instrument has been developed (see Table 5.5).

If additional funding is made available after the national assessments in Grades 2 and 6 are fully operational, Serbia might then consider introducing a wider range of subjects on an alternating basis every 2-3 years for the Grade 10 assessment (see above and Table 5.5). This could provide information about student learning in areas relevant to the country’s economic development. However, caution should be taken when adding subjects as this will add to the costs of administering the assessment and requires greater implementation capacity.

Table 5.5. Proposal for organisation of cycles for new national assessment (Year N to Year N+10)

  • Grade 2: mathematics and language (M, L) in every cycle; sample-based (S*) for the first administration, then census-based (C) annually once the instrument has been developed.

  • Grade 6: mathematics and language (M, L) every two years; sample-based (S, then S*) for the first three cycles, census-based (C) thereafter.

  • Grade 10: sample-based (S) every two to three years; mathematics and language in every cycle, alternating science (Sc.) with foreign language (FL).

Notes: C = census; S = sample; M = mathematics; L = language; Sc. = science; FL = foreign language.

* Serbia should consider moving towards a census-based assessment in the future.

Use challenging test items that are designed to assess student learning

In Serbia, some of the sample questions from the pilot national assessment that were shared with the review team required higher-order thinking skills. This demonstrates the IEQE’s efforts to align test questions with the competencies included in the new curriculum and student achievement standards. The capacity developed through this process can also support the country’s efforts to reform the content of high-stakes examinations. However, given the small number of sample questions available for review and the lack of statistical data on results from the pilot national assessment, this review is unable to draw general conclusions about the types of questions that will be included in the new national assessment. Nevertheless, Serbia will need to ensure that test items in the national assessment do not encourage memorisation and that proper item-writing conventions are followed, such as reviewing tests and items for potential bias and varying the placement of distractor choices (the incorrect options in a multiple-choice test) (Anderson and Morgan, 2008[37]). Distractor choices should also represent common mistakes made by students.
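
One simple check on whether distractors are functioning as intended is to look at how often each option is chosen in pilot data; a distractor that almost no one selects probably does not represent a common mistake. The responses and answer key below are invented for illustration.

```python
from collections import Counter

# Hypothetical responses to one multiple-choice item from a pilot administration.
responses = ["A", "C", "B", "C", "C", "D", "A", "C", "B", "C", "A", "C"]
correct_option = "C"

counts = Counter(responses)
total = len(responses)
for option in sorted(counts):
    share = counts[option] / total
    label = " (correct answer)" if option == correct_option else ""
    print(f"Option {option}: chosen by {share:.0%} of students{label}")
# Option D is chosen by very few students, suggesting this distractor may need
# to be revised before the item enters the operational assessment.
```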

Consider computer-based assessment delivery

The use of computers to administer national assessments is becoming more common, particularly in countries that have introduced national assessments relatively recently (OECD, 2013[2]). Compared to paper-based delivery, computer-based testing has several advantages. It tends to be cheaper to administer (aside from the initial capital investment), less prone to human error and integrity breaches in the administrative procedures and the results are delivered more quickly. Computer-based assessments also allow for greater flexibility in terms of developing test items that assess interdisciplinary skills in real-world contexts. This is an area Serbia would like to develop and an investment that could benefit the national exam system since students could take the Grade 8 exam and the Matura on the computer in the future.

Serbia’s pilot national assessment is currently paper-based. This allows Serbia to focus on finalising the development of the assessment instrument and procedures for its implementation. However, in the medium to long term, Serbia should consider moving towards a computer-based assessment. This will require overcoming key challenges, in particular the lack of technological infrastructure in schools (hardware, software, connectivity and technicians) and ensuring that teachers and students are familiar and comfortable with computer-based approaches to teaching and testing. When resources allow Serbia to make the transition to a computer-based assessment, the digital version should mimic the paper version to the greatest possible extent. This would allow researchers to compare student results using the different delivery methods and help ensure the reliability of the new testing approach. Before fully implementing the digital assessment, Serbia should evaluate the system’s readiness, address remaining issues and run a communications campaign to prepare schools, teachers, parents and students for the new computer-based national assessment.

Recommendation 5.3.2. Disseminate and use results from the national assessment to inform education policy

Considering the resource demands related to implementing national assessments, it is critical to optimise this tool by communicating findings in an appropriate form for interested parties (Kellaghan, Grenaney and Murray, 2009[38]). While developing and establishing a reliable national assessment should be Serbia’s top priority, the country should also reflect on how to most effectively report assessment results to support improvements in the education system. In particular, thought should be given to how results from the national assessment can be used to inform policymaking and drive improvements in teaching and learning.

Serbia plans to implement a new national assessment at the primary and secondary levels of education. The country will need to determine how results are reported and to which audiences. The uses and consequences of the data should also be made clear. These decisions should be taken with caution to avoid potentially negative consequences. Adequate financial resources for the dissemination of results should also be considered in central planning and budgeted accordingly.

Disseminate results in different ways

The IEQE plans to produce a national report to publicly disseminate results from the new national assessment and inform the policymaking process. This will not only help inform policy questions such as the extent to which students are mastering the curriculum but also support greater transparency and public accountability. However, reporting must be done with care to avoid potentially negative consequences, such as using the results to produce decontextualised rankings or attaching high-stakes accountability measures. To promote responsible dissemination and use of assessment results, Serbia’s national report should include three core components:

  • Provide context. The report should set the context of the assessment by highlighting its relevance for policymaking. For example, it could clearly state how the instrument supports monitoring of the curriculum, education strategy and sustainable development goals (SDGs).

  • Include technical details. The report should clearly state the objectives of the national assessment and the framework that guides its design and methodology. This level of transparency is an important part of establishing the assessment as a valid measure of student achievement and building public trust in both the assessment process and results.

  • Present results. The report should provide a description of achievement results and correlations with background information that is relevant for national policy. In particular, results might be disaggregated by gender, mother tongue, the geographic location of the school or socio-economic background (a minimal example of such disaggregation is sketched below). Over time, the report should also provide trend data to offer a picture of how student performance in Serbia evolves. The Australian Curriculum, Assessment and Reporting Authority (ACARA), for example, publishes an annual report that presents comparisons within jurisdictions and trend data from the National Assessment Program – Literacy and Numeracy (NAPLAN). ACARA also has a dedicated website for assessment results, which allows users to disaggregate results by Indigenous status, language background (other than English), geographic location, and parental occupation and level of education (ACARA, n.d.[39]).
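The disaggregated reporting described in the list above can be produced with simple grouping operations once an anonymised results file is available. The sketch below is illustrative only: the file name and column names are assumptions, not part of Serbia’s actual data model.

```python
import pandas as pd

# Hypothetical anonymised results file: one row per student.
# Assumed columns: "score" plus the background variables used for disaggregation.
df = pd.read_csv("national_assessment_results.csv")

for group_var in ["gender", "language_at_home", "school_location", "ses_quartile"]:
    summary = (
        df.groupby(group_var)["score"]
          .agg(mean_score="mean", std_dev="std", n_students="count")
          .round(1)
    )
    print(f"\nAverage scores by {group_var}")
    print(summary)
```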

In addition to a national report, Serbia should consider other ways to make data from the national assessment more accessible to the public and policymakers. For instance, the IEQE or the ministry could develop infographics, factsheets or short briefs that target different audiences. The IEQE could also create a dedicated webpage for the national assessment that provides information about its context, technical details and results. In Norway, for example, the Directorate for Education and Training has a website for national assessments that addresses frequently asked questions, offers guidance for schools and municipalities on how to make use of the data and includes a data portal where users can filter results and extract data to conduct different types of analysis. In addition to creating a website for the national assessment, Serbia could link results data to the ministry’s improved open data portal or the eClass Register (see Recommendation 5.2.3).

Providing data from Serbia’s national assessment in a public and user-friendly data portal can make this information more accessible to a wider range of stakeholders, especially when the data is easy to extract, download and present. It can also encourage researchers to conduct secondary analysis of individual questions, topics or skills, which would be important for identifying, at a national level, whether students in Serbia tend to struggle more with certain competencies or in certain domains. This information might, for example, reveal the need to reflect on how teaching of certain parts of the curriculum can be improved. Making assessment data public can also help investigate dimensions of educational inequity that are not yet well analysed or understood.
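As a simple example of the secondary analysis such a portal could enable, the sketch below computes average percent correct by curriculum domain from a hypothetical item-level export, highlighting domains where national performance is weakest. The file and column names are assumptions.

```python
import pandas as pd

# Hypothetical item-level export from the public portal:
# one row per student-item response, tagged with the domain the item measures.
# Assumed columns: "student_id", "domain", "correct" (1 if answered correctly, else 0).
responses = pd.read_csv("portal_item_export.csv")

percent_correct_by_domain = (
    responses.groupby("domain")["correct"].mean().mul(100).round(1).sort_values()
)
national_average = responses["correct"].mean() * 100

print("Percent correct by domain (weakest first):")
print(percent_correct_by_domain)
print(f"National average percent correct: {national_average:.1f}")
```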

Avoid decontextualised rankings of individual schools in census assessments

When census data from the Grade 2 (and eventually Grade 6) national assessment become available (see Recommendation 5.3.1), student information should be anonymised to protect privacy. However, Serbia will need to carefully assess the potential risks and benefits of publishing school-level results and develop a policy for how this information can be used most effectively. While reporting the performance of individual schools can support transparency and accountability, a single indicator, such as a school’s result on an assessment, is not an accurate indication of the school’s effectiveness because it does not take into account factors outside the school’s control (OECD, 2013[2]). Instead, Serbia could identify different benchmarks against which schools can compare themselves (Kellaghan, Greaney and Murray, 2009[38]). For example, school-level information could be presented alongside contextualised comparison groups, for instance by gender, linguistic minority status and RSA, as well as against the country as a whole.
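The sketch below illustrates the contextualised benchmarking suggested here, assuming a hypothetical anonymised census file: each school’s average is shown next to the average of schools serving a similar population and the average across all schools, rather than in a single ranking. The file and column names are assumptions.

```python
import pandas as pd

# Hypothetical anonymised census results: one row per student, with a school identifier
# and contextual variables outside the school's control. Column names are assumptions.
df = pd.read_csv("census_results.csv")  # "school_id", "score", "location_type", "ses_quartile"

school_means = (
    df.groupby(["school_id", "location_type", "ses_quartile"])["score"]
      .mean()
      .reset_index()
)

# Benchmark 1: the average of schools in the same comparison group (similar context).
school_means["comparison_group_mean"] = school_means.groupby(
    ["location_type", "ses_quartile"]
)["score"].transform("mean")

# Benchmark 2: the average across all schools.
school_means["all_schools_mean"] = school_means["score"].mean()

print(school_means.round(1).head())
```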

Use results to help inform teaching and learning practices

In addition to making results available for broad public dissemination and research, Serbia should report national assessment results in a way that supports teachers and schools. For example, Serbia could develop a national report for teachers to leverage the formative value of the assessment. In particular, this teacher report should contain item-level analysis with information about how students across the country performed on each item. This “item map” could include concrete examples of what students should know and be able to do across the ability range. It might also analyse common errors that students made, with suggestions on how to improve teaching of the same content in the future. When the Grade 2 assessment becomes census-based, private reports could be generated for the teachers and school leaders in each school. For sample-based assessments, each participating school might receive its own private report. The findings from these reports can help inform initial teacher education and teacher professional development.
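The item-level reporting described above could start from something as simple as the sketch below, which computes each item’s facility (percent correct) and the most frequently chosen wrong option from a hypothetical response file; the file and column names are assumptions, not Serbia’s actual data structures.

```python
import pandas as pd

# Hypothetical response-level file: one row per student-item response.
# Assumed columns: "item_id", "chosen_option", "correct_option".
responses = pd.read_csv("item_responses.csv")
responses["correct"] = responses["chosen_option"] == responses["correct_option"]

# Item facility: the share of students answering each item correctly.
facility = (
    responses.groupby("item_id")["correct"].mean().mul(100).round(1).rename("percent_correct")
)

# The most common wrong answer per item, a starting point for analysing typical errors.
wrong = responses[~responses["correct"]]
common_errors = (
    wrong.groupby("item_id")["chosen_option"]
         .agg(lambda options: options.value_counts().idxmax())
         .rename("most_common_wrong_option")
)

item_report = pd.concat([facility, common_errors], axis=1)
print(item_report.sort_values("percent_correct"))
```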

Recommendation 5.3.3. Ensure the sustainability of the national assessment

In the past, Serbia’s national assessments were financed by donors on an ad-hoc basis and without plans or government funding to carry out these exercises in the medium to long term (World Bank, 2012[20]). This partly explains why the country has not conducted a national assessment since 2006. It also highlights the need for policymakers to ensure that the new national assessment has the capacity and resources needed to establish this instrument as a reliable tool for system evaluation. While Serbia appears to have the political will to introduce a new framework for national assessment, the country must address a number of potential threats to ensure the assessment’s sustainability.

The biggest threat to the sustainability of Serbia’s new national assessment is the lack of stable funding. Currently, Serbia has allocated funds to develop a sample-based assessment that covers two subject areas; however, the country’s current education strategy and action plans make no explicit reference to a national assessment, making it difficult to ensure continuity. There are also concerns about the capacity of the IEQE, which will lead the development of the new national assessment. The IEQE is already operating with a limited budget and a growing list of responsibilities, which includes reforming national learning standards and examinations (MoESTD, 2018[17]).

Embed the national assessment in Serbia’s new education strategy

Serbia could support the national assessment’s sustainability by including its development and implementation as an indicator in the country’s new education strategy 2030. Such an indicator was absent from the current strategy, and its inclusion would highlight the importance of having a national assessment that supports system improvement. Moreover, the data generated by the national assessment could be used to help measure learning goals included in the new strategy. Achievable targets should accompany these goals. For example, Serbia could set a goal to improve the learning outcomes of disadvantaged students, and a target might be to have no more than X% of these students score at Level 1 by 2030. Of course, this can only be done after the national assessment has been established and results have been analysed to determine feasible goals and targets.
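Once real baselines exist, a target of this kind can be monitored with a few lines of analysis. The sketch below assumes a hypothetical results file in which each student already has an assigned proficiency level and a disadvantage flag; the target value is a placeholder, not a recommendation.

```python
import pandas as pd

# Hypothetical anonymised results file. Assumed columns:
# "proficiency_level" (integer level per student) and "disadvantaged" (True/False).
df = pd.read_csv("assessment_results.csv")

TARGET_SHARE_AT_OR_BELOW_LEVEL_1 = 0.20  # placeholder target, to be set from actual baselines

disadvantaged = df[df["disadvantaged"]]
share_at_or_below_level_1 = (disadvantaged["proficiency_level"] <= 1).mean()

print(f"Disadvantaged students at or below Level 1: {share_at_or_below_level_1:.1%}")
print(f"Distance to target: {share_at_or_below_level_1 - TARGET_SHARE_AT_OR_BELOW_LEVEL_1:+.1%}")
```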

Establish a steering committee to make national assessment a political priority

Another way that Serbia can ensure the new assessment’s sustainability is to make it a political priority by creating a high-level steering committee. This committee could be led by the minister, who could provide the leadership needed to defend the assessment’s validity when results are released, ensure that adequate financial support is secured and co-ordinate the efforts of RSAs, schools and teachers in implementing the assessment instrument. The steering committee could take decisions about the new assessment’s design, implementation and use (see Recommendation 5.3.1), ensuring that it aligns with curriculum reforms, school evaluation and national education policy goals. Another responsibility could be defining the wider national assessment framework (see Table 5.6). Once the steering committee determines what is technically feasible in the Serbian context, it could prepare a concept note to plan the national assessment’s development. The OECD review team was informed that Serbia intends to establish a dedicated group to fulfil this purpose; however, at the time this report was drafted, no concept note for the new national assessment had been prepared.

In addition to the minister, other key members of the steering committee could include diverse stakeholders who represent different backgrounds and interests. The steering group should also include technical expertise on the development and use of national assessments, such as the director of the IEQE and the heads of other education agencies. Serbia might also consider drawing on international experience by inviting an international advisor to join the steering committee or studying the case of another country that has been successful in developing and running a national assessment. For example, North Macedonia is reviewing the Slovenian national assessment experience to develop its own national assessment (see Box 5.5). The steering committee’s mandate and activities will need to be clearly documented to promote transparency if it is to become an official body that guides the development of Serbia’s national assessment.

Box 5.5. The Slovenian national assessment experience

The official objective of the Slovenian National Assessment of Knowledge (NAK) is to improve the quality of teaching and learning in Slovenia. As such, the national assessment is low-stakes and does not affect students’ marks or their progression into higher levels of education. A notable exception to this regulation is that student results can be used to determine secondary school enrolment if spaces are limited in certain schools.

Since 2006, the assessment has been administered annually to students in Grades 6 and 9. Students in Grade 6 take mother tongue, mathematics and a foreign language, while students in Grade 9 take mother tongue, mathematics and a subject selected by the minister from a pre-defined list. The Slovenian National Examinations Centre is responsible, through various committees, for creating the guidelines, items and materials of the assessment. A separate organisation, the National Education Institute, is responsible for creating the marking procedures, training the markers and performing research and analysis using the results.

Results from the assessment are reported at the student, school and national levels. Students receive an individual report that can be accessed electronically. The report shows the number and percentage of questions the student answered correctly and classifies the student into one of four proficiency levels. Each student’s results are compared with their school average and the national average. Item-level analysis, showing how the student performed on different types of questions, is also provided.

Schools receive a report that shows the average performance of their students compared to regional and national averages. At the national level, a report that summarises the country’s results is produced every year. The results are disaggregated by grade, subject, gender and region. All annual reports are published online. National surveys reveal that over 90% of head teachers consider their students’ national assessment results in their future work and over 80% of all teachers believe that the assessment results give them useful information about their work.

Sources: Eurydice (2018[40]), Assessment in Single Structure Education – Slovenia, https://eacea.ec.europa.eu/national-policies/eurydice/content/assessment-single-structure-education-35_en (accessed on 23 September 2018); Brejc, M., M. Sardoc and D. Zupanc (2011[41]), OECD Review on Evaluation and Assessment Frameworks for Improving School Outcomes: Country Background Report Slovenia, http://www.oecd.org/education/school/48853911.pdf; RIC (2006[42]), Državni Izpitni Center (RIC) [National Examinations Centre], https://www.ric.si/ (accessed on 13 November 2018).

Make plans to ensure sufficient capacity and resources for national assessment

To ensure the sustainability of Serbia’s new national assessment over the medium term, the country will need sufficient technical competency and financial resources. Drawing on its experience of administering Serbia’s national examinations and, since 2018, international assessments, the IEQE currently has some of the infrastructure and capacity needed to administer large-scale assessments of student learning. For example, IEQE staff have expertise in sampling, test design and statistical analysis. However, these competencies need to be strengthened if a regular cycle of national assessments is put in place, as recommended by this review. Moreover, there appears to be no increase in funding planned for the IEQE, despite the institute’s additional responsibilities for international assessment and the new national assessment. This review recommends moving the institute’s external school evaluation functions to an independent agency to relieve some of the workload; however, even with this change, the IEQE’s 35 staff members will still be stretched to deliver a range of important education reforms.

While the IEQE is well placed to oversee the development and implementation of Serbia’s new national assessment, the government should establish a multi-year budget that plans for the resources needed to sustain the assessment, at least for the duration of the next education strategy. This would reduce Serbia’s dependence on donor support for national assessment, allow the IEQE to hire staff with relevant competency profiles and enable investment in the technological infrastructure needed to carry out the new assessment fully. To ensure sustainability, Serbia should introduce the national assessment on a small scale, starting with only two grades and assessing foundation skills (see Recommendation 5.3.1). Plans for these assessments should be fully costed and their funding secured. Serbia could then discuss whether to expand the national assessment framework to provide additional information about student learning in other grades and subject areas. These discussions should consider several factors, including the results of the existing assessments and the extent to which they have been successfully implemented. It is also important to consider what resources are available to expand the national assessment system.

Establish an assessment framework for system monitoring

Table 5.6 proposes a holistic assessment framework for Serbia. This aggregates recommendations from across this review to demonstrate the various sources of information available to monitor student learning outcomes in Serbia. The new steering committee could be responsible for developing this framework.

Table 5.6. Proposal for a national assessment framework in Serbia

Grades | Assessment | Frequency | Population | Subjects | Primary purpose
Grade 2 | National assessment | Two-year cycle to start, then annual | Sample to start, then census | Mathematics and Serbian language (or language of instruction) | System monitoring
Grade 4 | TIMSS (international assessment) | Four-year cycle* | Sample | Mathematics and science | System monitoring
Grade 6 | National assessment | Two-year cycle | Sample to start, then census | Mathematics and Serbian language (or language of instruction) | System monitoring
Grade 8 | Final exam (end of basic education) | Annual | Census | Mathematics, Serbian language (or mother tongue), and combined test (see Table 5.5) | Student selection and certification
Grade 8/9 (age 15) | PISA (international assessment) | Three-year cycle** | Sample | Mathematics, science, reading | System monitoring
Grade 10 | National assessment | Two-year cycle | Sample | Alternate according to national priorities | System monitoring
Grades 11 or 12 (depending on cycle) | Matura exam | Annual | Census | Mathematics, Serbian language (or a recognised minority language) and electives (see Table 5.5) | Student selection and certification

Notes: This table is based on recommendations from across this review. It aggregates proposed and current sources of information on student learning that can be used for system monitoring.

* Serbia has participated in TIMSS at the Grade 4 level since 2011. Previously, only Grade 8 participated in TIMSS.

** Serbia did not participate in the 2015 cycle of PISA but participation has otherwise been consistent.

Table of recommendations


Policy issue 5.1. Using the new education strategy to focus on achieving national priorities

  Recommendation 5.1.1. Identify national priorities for the new strategy
    • Evaluate the 2020 strategy and other evidence to prioritise key strategic issues
    • Consider a range of evidence
    • Identify key national goals for education
    • Undertake a national consultation to develop the new strategy

  Recommendation 5.1.2. Develop action plans and a monitoring framework with measurable targets
    • Create new action plans with specific actions and measurable outcomes

  Recommendation 5.1.3. Monitor progress to build accountability for achieving education goals
    • Strengthen the role of the special working group to monitor the strategy
    • Develop platforms for regular reporting on progress

Policy issue 5.2. Enhancing the availability and use of evidence for accountability and policymaking

  Recommendation 5.2.1. Strengthen foundations for effective data collection and storage
    • Establish a national indicator framework to measure progress
    • Harmonise data collection by establishing clear definitions and protocols
    • Develop processes to identify data gaps
    • Link education data to data stored by other agencies

  Recommendation 5.2.2. Support the use of data and evidence in policymaking
    • Re-establish the analytics group in ministry
    • Strengthen the IEQE’s capacity and resources
    • Make greater use of the research community for policymaking

  Recommendation 5.2.3. Improve the functionality of UISE to make data more accessible
    • Disseminate data more effectively to inform education actors and society
    • Help schools to make greater use of data

Policy issue 5.3. Developing the national assessment to support system goals

  Recommendation 5.3.1. Consider the design options to align the national assessment with its stated purpose
    • Implement national assessment in Grades 2 and 6, and consider Grade 10 in the future
    • Maintain plans for sample-based assessment but consider census-based assessments in the future
    • Develop a timetable to assess foundation skills in Grades 2 and 6
    • Use challenging test items that are designed to assess student learning
    • Consider computer-based assessment delivery

  Recommendation 5.3.2. Disseminate and use results from the national assessment to inform education policy
    • Disseminate results in different ways
    • Avoid decontextualised rankings of individual schools in census assessments
    • Use results to help inform teaching and learning practices

  Recommendation 5.3.3. Ensure the sustainability of the national assessment
    • Embed the national assessment in Serbia’s new education strategy
    • Establish a steering committee to make national assessment a political priority
    • Make plans to ensure sufficient capacity and resources for national assessment
    • Establish an assessment framework for system monitoring

References

[11] Abdul-Hamid, H. (2014), What Matters Most for Education Management Information Systems, World Bank, http://www.worldbank.org (accessed on 16 July 2018).

[34] Abdul-Hamid, H., S. Mintz and N. Saraogi (2017), From Compliance to Learning: A System for Harnessing the Power of Data in the State of Maryland, The World Bank, https://doi.org/10.1596/978-1-4648-1058-9 (accessed on 8 December 2018).

[39] ACARA (n.d.), NAPLAN results, https://reports.acara.edu.au/ (accessed on 22 July 2019).

[37] Anderson, P. and G. Morgan (2008), Developing Tests and Questionnaires for a National Assessment of Educational Achievement, https://elibrary.worldbank.org/doi/abs/10.1596/978-0-8213-7497-9 (accessed on 3 August 2018).

[41] Brejc, M., M. Sardoc and D. Zupanc (2011), OECD Review on Evaluation and Assessment Frameworks for Improving School Outcomes: Country Background Report Slovenia, OECD, Paris, http://www.oecd.org/education/school/48853911.pdf (accessed on 10 June 2019).

[23] Bryson, J. (2018), Strategic Planning for Public and Nonprofit Organizations: A Guide to Strengthening and Sustaining Organizational Achievement, John Wiley & Sons.

[24] Bryson, J., F. Berry and K. Yang (2010), “The State of Public Strategic Management Research: A Selective Literature Review and Set of Future Directions”, The American Review of Public Administration, Vol. 40/5, pp. 495-521, https://doi.org/10.1177/0275074010370361.

[5] Burns, T. and F. Köster (eds.) (2016), Governing Education in a Complex World, Educational Research and Innovation, OECD Publishing, Paris, https://doi.org/10.1787/9789264255364-en.

[30] Department of Education (n.d.), “Common Education Data Standards (CEDS)”, https://ceds.ed.gov/ (accessed on 1 August 2018).

[29] Department of Education and Skills (2018), Action Plan for Education 2018, Government of Ireland, http://www.education.ie (accessed on 25 June 2019).

[35] DFID (2011), National and International Assessment of Student Achievement: A DFID Practice Paper, Department for International Development, London, https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/67619/nat-int-assess-stdnt-ach.pdf (accessed on 13 July 2018).

[26] Eckerson, W. (2011), Performance Dashboards: Measuring, Monitoring, and Managing Your Business, Wiley, https://www.wiley.com/en-us/Performance+Dashboards:+Measuring,+Monitoring,+and+Managing+Your+Business,+2nd+Edition-p-9780470589830 (accessed on 29 January 2019).

[15] European Commission (2018), Commission Staff Working Document: Serbia 2018 Report, European Commission 2018 Communication on EU Enlargement Policy [COM(2018) 450 final], Strasbourg, https://ec.europa.eu/neighbourhood-enlargement/sites/near/files/20180417-serbia-report.pdf (accessed on 13 May 2019).

[18] European Commission (n.d.), European Policy Cooperation (ET 2020 Framework), https://ec.europa.eu/education/policies/european-policy-cooperation/et2020-framework_en (accessed on 8 July 2019).

[21] Eurostat (2019), Eurostat, https://ec.europa.eu/eurostat (accessed on 14 June 2019).

[40] Eurydice (2018), Assessment in Single Structure Education - Slovenia, https://eacea.ec.europa.eu/national-policies/eurydice/content/assessment-single-structure-education-35_en (accessed on 23 September 2018).

[25] George, B. and S. Desmidt (2014), “A State of Research on Strategic Management in the Public Sector: An Analysis of the Empirical Evidence”, in Joyce, P. and A. Drumaux (eds.), Strategic Management in Public Organizations: European Practices and Perspectives, Routledge.

[9] House of Commons (2011), “Accountability for Public Money: Twenty-eighth Report of Session 2010-11”, Committee of Public Accounts, https://publications.parliament.uk/pa/cm201011/cmselect/cmpubacc/740/740.pdf (accessed on 25 June 2019).

[32] IES (2019), Education Research Grants Programs, https://ies.ed.gov/funding/ncer_progs.asp (accessed on 26 July 2019).

[8] Kaplan, R. and D. Norton (1992), “The Balanced Scorecard: Measures that Drive Performance”, Harvard Business Review, pp. 71-79, https://steinbeis-bi.de/images/artikel/hbr_1992.pdf (accessed on 17 October 2019).

[38] Kellaghan, T., V. Greaney and S. Murray (2009), Using the Results of a National Assessment of Educational Achievement, The World Bank, Washington, DC, https://doi.org/10.1596/978-0-8213-7929-5.

[28] Ministry of Education (2019), Education Counts, https://www.educationcounts.govt.nz/home (accessed on 26 August 2019).

[17] MoESTD (2018), OECD Review of Evaluation and Assessment: Country Background Report for Serbia, Ministry of Education, Science and Technological Development, Belgrade.

[16] MoESTD (2018), Progress Report on the Action Plan for the Implementation of the Strategy for Education Development in Serbia by 2020, Ministry of Education, Science and Technological Development, Belgrade, http://www.mpn.gov.rs/wp-content/uploads/2018/08/AP-SROS-IZVESTAJ-15jun-Eng.pdf (accessed on 14 May 2019).

[19] MoESTD (2015), Action Plan for Implementation of the Strategy for Education Development in Serbia 2020, Ministry of Education, Science and Technological Development, Belgrade.

[14] MoESTD (2012), Strategy for Education Development in Serbia 2020, Ministry of Education, Science and Technological Development, Belgrade, http://erasmusplus.rs/wp-content/uploads/2015/03/Strategy-for-Education-Development-in-Serbia-2020.pdf (accessed on 17 October 2019).

[27] NCES (2019), The Condition of Education, https://nces.ed.gov/programs/coe/ (accessed on 26 August 2019).

[13] OECD (2018), Education Policy Outlook 2018: Putting Student Learning at the Centre, OECD Publishing, Paris, https://doi.org/10.1787/9789264301528-en.

[10] OECD (2018), Open Government Data Report: Enhancing Policy Maturity for Sustainable Impact, OECD Digital Government Studies, OECD Publishing, Paris, https://doi.org/10.1787/9789264305847-en.

[4] OECD (2017), Systems Approaches to Public Sector Challenges: Working with Change, OECD Publishing, Paris, https://doi.org/10.1787/9789264279865-en.

[1] OECD (2017), The Principles of Public Administration: Monitoring Report of Serbia, OECD Publishing, Paris, http://www.sigmaweb.org/publications/Monitoring-Report-2017-Serbia.pdf (accessed on 21 June 2019).

[12] OECD (2015), Education at a Glance 2015: OECD Indicators, OECD Publishing, Paris, https://dx.doi.org/10.1787/eag-2015-en.

[2] OECD (2013), Synergies for Better Learning: An International Perspective on Evaluation and Assessment, OECD Reviews of Evaluation and Assessment in Education, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264190658-en.

[36] OECD (2011), Education at a Glance 2011: OECD Indicators, OECD Publishing, Paris, https://doi.org/10.1787/eag-2011-en.

[7] OECD (2009), Measuring Government Activity, OECD Publishing, Paris, https://doi.org/10.1787/9789264060784-en.

[42] RIC (2006), Državni Izpitni Center (RIC) [National Examinations Centre], https://www.ric.si/ (accessed on 13 November 2018).

[3] Schick, A. (2003), “The Performing State: Reflection on an Idea Whose Time Has Come but Whose Implementation Has Not”, OECD Journal on Budgeting, Vol. 3/2, https://doi.org/10.1787/budget-v3-art10-en.

[31] U.S. Department of Education (2018), Privacy and Data Sharing, https://studentprivacy.ed.gov/privacy-and-data-sharing (accessed on 13 July 2018).

[33] Villanueva, C. (2003), Education Management Information System (EMIS), UNESCO, http://unesdoc.unesco.org/images/0015/001568/156818eo.pdf (accessed on 1 August 2018).

[22] World Bank (2016), Serbia Education Sector: Preliminary Findings of the Functional Review and Options for Reforms, World Bank, Belgrade, https://slideplayer.com/slide/10187898/ (accessed on 17 October 2019).

[20] World Bank (2012), Serbia SABER Country Report: Student Assessment, World Bank, Washington, DC, http://wbgfiles.worldbank.org/documents/hdn/ed/saber/supporting_doc/CountryReports/SAS/SABER_SA_Serbia_CR_Final_2012R.pdf (accessed on 12 May 2019).

[6] World Bank (2004), Ten Steps to a Results-Based Monitoring and Evaluation System, World Bank, Washington, DC, https://www.oecd.org/dac/peer-reviews/World%20bank%202004%2010_Steps_to_a_Results_Based_ME_System.pdf (accessed on 17 October 2019).

Metadata, Legal and Rights

This document, as well as any data and map included herein, are without prejudice to the status of or sovereignty over any territory, to the delimitation of international frontiers and boundaries and to the name of any territory, city or area. Extracts from publications may be subject to additional disclaimers, which are set out in the complete version of the publication, available at the link provided.

https://doi.org/10.1787/225350d9-en

© OECD 2020

The use of this work, whether digital or print, is governed by the Terms and Conditions to be found at http://www.oecd.org/termsandconditions.