7. Case study: Massachusetts’ (United States) Early Warning Indicator System (EWIS)

This chapter focuses on the role of integrated information systems in the public school system of the United States. In the United States, the education system is regulated at the state level; however, an accountability system based on education data links the state and federal levels and helps ensure that standards are met nation-wide. The case study looks at the Early Warning Indicator System (EWIS) in the state of Massachusetts. Drawing on various data sources, EWIS calculates, for each student from first to twelfth grade, the probability of meeting certain predefined academic milestones. Schools can use this information to direct targeted support measures to students at risk. This chapter introduces the technical details of EWIS and the distribution of responsibilities in its use. Furthermore, it discusses how co-ordination across different levels of government can be achieved when establishing integrated information systems.


Introduction

The state of Massachusetts is widely regarded as having the best public school system in the United States of America, scoring first in national student assessment tests. Massachusetts’ students also rank among the top performers internationally in reading, science and mathematics (OECD, 2016[1]). In the media and in public perception, the success of Massachusetts’ education system is largely attributed to a major educational reform in 1993, the Massachusetts Education Reform Act (MERA) (Rowe, 2016[2]; Baker, 2019[3]). The academic debate, however, is more ambiguous about the effect of MERA on educational outcomes (Dee and Levine, 2004[4]; Guryan, 2001[5]; McDermott, 2004[6]). While Massachusetts’ schools are still predominantly controlled by local school districts, the reform redefined the partnership between districts and the state, and formally put the state government in charge of ensuring the equity and quality of education in Massachusetts’ schools. Today, schools are closely monitored by the state, especially those with significant opportunity and achievement gaps.

The educational reform resulted in a major increase in state funding for public schools, especially in low-spending districts, where total per-pupil revenues increased by 7% between 1993 and 1996 (Dee and Levine, 2004[4]). Additional state funding was particularly directed at schools in poorer areas to increase equality of opportunity among students. In exchange for receiving funding, schools were required to align their curricula to state-wide standards and to participate in a student assessment system, the Massachusetts Comprehensive Assessment System (MCAS). MERA also introduced an accountability system for school districts and schools. Low-performing schools are subject to state intervention and need to implement various measures to meet state standards. If their performance does not improve, they can be closed down by the state administration. Massachusetts’ accountability system is based on various data collection exercises that help assess the quality of schools, including MCAS results as well as other indicators, such as high school graduation rates and absenteeism. Over the years, Massachusetts has built a comprehensive information management system to assess students, schools and school districts.

This case study focuses on Massachusetts’ Early Warning Indicator System (EWIS), which is part of the state’s information management system, although it is not used for accountability purposes. EWIS builds on existing state-wide data collections and uses statistical models to calculate the probability that a student will meet a predefined academic milestone. It assigns each student a risk level according to these probabilities, covering every student in Massachusetts from 1st to 12th grade. EWIS helps schools and school districts identify students at risk of not meeting academic milestones so that they can provide them with additional support. EWIS data are provided and administered by the Department of Elementary and Secondary Education (DESE). Schools and districts have access to the data, and using the data as part of an implementation cycle (Figure 7.3) is their responsibility.

The use of EWIS is not obligatory for districts and schools, which creates a challenge. Schools and districts need to be made aware of EWIS and be given the knowledge to use the data. They also need financial and personnel support to use the data effectively. To achieve these goals, DESE and the school districts work together to enhance data literacy in schools and help them establish support systems based on early warning indicators. With reference to the overall framework of this report, this case study therefore focuses on two of the four dimensions:

  • Promoting co-ordination, co-operation and collaboration across the whole of government.

  • Building integrated information systems.

The following section provides a short introduction to the technical details of EWIS and the actors involved at the federal, state and local levels. The analysis that follows is based on insights from interviews with stakeholder representatives and experts conducted in September 2019 in Massachusetts. The case study closes with a number of policy recommendations for the future governance of EWIS.

Massachusetts’ education and training system

Massachusetts’ school system is widely regarded as the best public school system in the United States. Since 2005, the state has scored first in the National Assessment of Educational Progress (NAEP), a representative US-wide academic achievement test, in 4th and 8th grade reading and mathematics (Commonwealth of Massachusetts, 2020[7]). Massachusetts was among the three US states, along with Connecticut and Florida, that asked the OECD for state-level results in the 2015 Programme for International Student Assessment (PISA) and scored well above US and international averages, ranking 2nd in reading (along with Canada and Hong Kong), 6th in science (equal with Macau, China) and 20th in mathematics out of the 72 participating countries and economies (OECD, 2016[1]). In the 2019 U.S. News Best High Schools Ranking, Massachusetts was the state with the highest percentage of top-ranked public high schools in the United States (U.S. News, 2020[8]). Other education indicators also show that Massachusetts is among the top performing states within the United States. With a high school graduation rate of 88%, it ranks 12th among the 50 states and the District of Columbia, above the US average of 85%. The school drop-out rate among 16 to 24 year-olds is 3.6% (US average: 5.4%; rank: 7th), the pupil-teacher ratio is 13.3 (US average: 16.0; rank: 11th) and the state spends USD 16 986 per pupil (US average: USD 11 841; rank: 7th) (National Center for Education Statistics, 2018[9]).

Despite this overall strong performance, a more detailed analysis of the data reveals striking differences between subgroups of students. Differentiating students by race and ethnicity shows that students of colour and students of Hispanic origin have much lower 8th grade NAEP results in mathematics than white students or students of Asian origin (Figure 7.1). Similarly, students from low-income families, students whose first language is not English and students with disabilities perform worse than students from high-income families, those without disabilities or those with English as their mother tongue (Figure 7.2). Similar patterns can be observed when comparing 4th grade reading performance on NAEP, PISA scores and MCAS scores by student group (Massachusetts Education Equity Partnership, 2018[10]). Race and ethnicity are also strongly associated with the likelihood of graduating from high school. While the high school graduation rate of white students is 93%, only 80% of African American students and 74% of Hispanic students graduate from high school. Compared to other US states, Massachusetts performs below average in this regard: when comparing graduation rates by race across the 50 states and the District of Columbia, Massachusetts ranks 7th for white students, 21st for African American students and 38th for Hispanic students (National Center for Education Statistics, 2018[9]).

Figure 7.1. Massachusetts 8th grade mathematics performance in NAEP, by race/ethnicity

Source: Massachusetts Education Equity Partnership (2018[10]), Number One for Some: Opportunity and Achievement in Massachusetts, https://number1forsome.org/wp-content/uploads/sites/16/2018/09/Number-1-for-Some-9.25-18.pdf.

 StatLink https://doi.org/10.1787/888934112785

Figure 7.2. Massachusetts 8th grade mathematics performance in NAEP, by socio-economic background

Source: Massachusetts Education Equity Partnership (2018[10]), Number One for Some: Opportunity and Achievement in Massachusetts, https://number1forsome.org/wp-content/uploads/sites/16/2018/09/Number-1-for-Some-9.25-18.pdf.

 StatLink https://doi.org/10.1787/888934112804

Equity between students, as well as between school districts, is a major concern in Massachusetts’ education system. In the United States, every public school belongs to a school district, an independent local administration responsible for various issues concerning primary and secondary education.1 In Massachusetts, there are 1 850 public schools in 525 school districts. School policy making takes place first and foremost at the district level. Among other things, districts are responsible for the maintenance of school buildings, the selection of curriculum materials (e.g. books), and for ensuring that schools comply with federal and state law. School districts also provide a large share of public school financing and collect property taxes for this purpose. In Massachusetts, 57% of public school revenue comes from local (school district) sources (US average: 44.9%), while only 38.7% comes from the state (US average: 47.1%) and 4.3% from the federal budget (US average: 8%) (US Census Bureau, 2018[11]). The heavy reliance on local taxes creates inequalities, as richer districts generate higher tax revenues than poorer districts.

Massachusetts started to tackle the problem of inequalities in 1993 with MERA, through which the state government took responsibility, for the first time, for establishing and ensuring a minimum level of educational spending for all districts (the so-called “foundation budget”). The foundation budget is determined in three steps. First, each district’s funding need is calculated, based on the number of pupils (including kindergarten), the number of students who are economically or socially disadvantaged, and a price differential that accounts for regional differences in the cost of living. Second, each district’s local contribution is determined, depending on property values. Third, state aid is calculated by summing the funding needs of all schools within a district (derived from the needs-based formula in step one) and subtracting the district’s local contribution (calculated in step two) from this sum; the state pays the difference between the need and the local contribution. State aid can never be lower than in the previous year (Lee and Blagg, 2018[12]), and districts can contribute more than the foundation budget without losing state aid. On average, districts’ school spending is 26% above the foundation budget, and in some districts it is more than 300% of what the formula calculates (Lee and Blagg, 2018[12]). Because only wealthy districts can afford to spend more than the formula requires, the funding formula has increasingly been criticised by policy makers and stakeholders for privileging wealthy districts over those with many economically disadvantaged students. As a consequence, the state has been evaluating school funding since 2015 (Chester, 2014[13]). In September 2019, the Senate and the House of Representatives of Massachusetts announced that they would reform the funding formula and give an additional USD 1.5 billion to underprivileged schools (Vazins and Stout, 2019[14]).
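To make the three-step logic concrete, the sketch below works through a hypothetical district with made-up rates. It is only an illustration of the mechanics described above: the real funding formula involves many more factors, and the base amounts, increments and contribution rate used here are assumptions, not the official parameters.

```python
# Illustrative sketch of the three-step foundation budget logic described above.
# All rates and figures are hypothetical; the real formula uses many more factors.

def foundation_budget(pupils: int, disadvantaged: int, cost_index: float) -> float:
    """Step 1: a district's needs-based spending level (hypothetical rates)."""
    base_rate = 11_000      # assumed base amount per pupil, USD
    extra_rate = 4_000      # assumed increment per disadvantaged student, USD
    return (pupils * base_rate + disadvantaged * extra_rate) * cost_index

def local_contribution(property_value: float, contribution_rate: float) -> float:
    """Step 2: what the district is expected to raise from local property wealth."""
    return property_value * contribution_rate

def state_aid(need: float, local: float, previous_aid: float) -> float:
    """Step 3: the state covers the gap between need and local contribution,
    and aid never falls below the previous year's level."""
    return max(need - local, previous_aid)

# Hypothetical district: 2 000 pupils, 600 of them disadvantaged
need = foundation_budget(pupils=2_000, disadvantaged=600, cost_index=1.05)
local = local_contribution(property_value=1.2e9, contribution_rate=0.01)
aid = state_aid(need, local, previous_aid=9_000_000)
print(f"need={need:,.0f}  local={local:,.0f}  state aid={aid:,.0f}")
```

In this illustration the district’s need exceeds its expected local contribution, so the state pays the difference; in a district with higher property values, the guaranteed previous-year aid would be the binding amount.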

In exchange for increased funding, the state wanted to increase its influence over schools, and therefore implemented assessment and accountability measures in Massachusetts’ primary and secondary education system. It established MCAS at the student level, an annual academic assessment of every student from 3rd to 10th grade in mathematics, reading and science. The test is based on the Massachusetts curriculum framework, which was introduced for the first time by MERA. MCAS results are very important for schools and districts, as they are used by parents and policy makers to monitor quality and performance. As a consequence, schools have been observed to narrow the curriculum to the subjects assessed in MCAS. In a recent report, the Massachusetts Commissioner of Elementary and Secondary Education criticised this trend: “in too many cases, they have seen the curriculum narrowed to focus on assessed subjects or shallow coverage of content in a rush to cover all standards before MCAS testing. They also reported instances of too much time spent drilling students on tested skills, divorced from a cumulative, meaningful learning context” (Riley, 2019[15]).

In addition to the assessment of students, MERA established an accountability and assessment system for schools and districts. Based on various indicators, such as MCAS results, the high school completion rate and the percentage of chronic absenteeism, it provides information on how each school is doing compared to other schools and on progress towards meeting certain targets. Each school is assigned to one of six groups, ranging from “needs broad comprehensive support” to “schools of recognition” (DESE, 2020[16]). Together with the district, the state intervenes in every school that belongs to one of the two lowest performing groups. Examples of interventions include the expansion of the school day to increase the number of lessons, the replacement of the curriculum to improve teaching content, and the suspension of collective bargaining agreements to push back the influence of teacher unions (DESE, 2011[17]). In very rare cases, schools are closed.

Massachusetts’ accountability system is data driven. DESE publishes data collected by the state and school districts through a variety of tools and reports. The DESE website lists 11 data reports created from various data collections, covering drop-out rates, teacher assessment data, enrolment data, enrolment in institutions of higher education, grade retention, graduation rates, MCAS results, mobility rates, per-pupil expenditure, plans of high school graduates and student growth percentiles. Districts and schools also often compile additional local data that are not reported to the state.

The reforms of Massachusetts’ education system following MERA must be understood within a broader nation-wide debate on the US education system. Ten years before MERA, a federal commission published the report A Nation at Risk, which described the failure of American public schools and demanded higher academic standards. The report was followed by considerable debate on the public education system, resulting in the 2001 No Child Left Behind Act (NCLB). This act expanded the federal role in public education, in particular by establishing a comprehensive assessment and accountability system in exchange for additional federal funding. States that wanted to receive federal money were obliged to implement state-wide academic assessment tests and to control school quality. The 2015 Every Student Succeeds Act (ESSA) relieves the states from such strict federal control. However, states are still required to implement student assessment tests and to evaluate the quality of schools using quantitative indicators.

With the NCLB, the United States underwent a policy shift towards federal involvement in public schools. This was possible because the positions of Democrats and Republicans on education policy converged in the 1990s. While Republicans abandoned their opposition to a federal role in education, Democrats’ focus shifted from input and equity concerns to standards and accountability (McGuinn, 2005[18]). As a consequence, the NCLB established greater federal investment in exchange for school and state accountability based on quantitative data assessments. The NCLB and ESSA thus had a profound impact on the US public education system and introduced data-driven assessment and accountability systems in the majority of states for the first time. For Massachusetts, however, these systems were not new, as they had already been implemented with MERA. Nevertheless, the federal educational reforms intensified the role that accountability measures play in Massachusetts and accelerated the development of information management systems.

The Early Warning Indicator System

Massachusetts’ Early Warning Indicator System (EWIS) was established in 2011 to help identify students at risk of not meeting certain academic milestones from 1st to 12th grade. It was established after Massachusetts received a federal grant as part of the Statewide Longitudinal Data Systems (SLDS) grant programme. The SLDS programme was one response to the accountability provisions of the 2001 NCLB and was authorised by the Educational Technical Assistance Act of 2002. It awards grants to states to enable them to adequately collect, manage and use educational data. Massachusetts’ application for the 2009 SLDS grant included the expansion of its existing Early Warning Indicator Index (EWII), which evaluated the probability that students entering 9th grade (the first grade of high school in the United States) would not graduate from high school on time. EWII was criticised for its limited methodology and scope: “The current methodology was developed by DESE using a limited set of criteria and a methodology that is not as research based as it needs to be. In addition, the current reports do not go beyond grade nine or identify students that are primed for additional academic opportunities” (US Department of Education, 2009[19]). School districts also criticised that schools were receiving information about at-risk students too late.

In order to improve its early warning indicator system, Massachusetts wanted to use the SLDS grant to establish a broad data information system that predicts the likelihood of all students, in every grade, meeting certain key academic milestones, such as graduating from high school. The grant enabled the state to collaborate with the American Institutes for Research (AIR), a non-profit behavioural and social science research organisation, in developing EWIS. AIR and DESE began by conducting a literature review on early warning indicators and reviewing early warning systems in other US states and districts. They compared the findings with the indicators used in EWII and suggested a design for EWIS. A series of multilevel models was then tested to identify the variables that best predict the likelihood of students failing key academic benchmarks (American Institutes for Research, 2020[20]). In summer 2012, DESE provided risk data to the districts for the first time. While some aspects of the underlying model have been updated since its development, and the data in the model are updated annually, the main statistical assumptions have stayed the same. Following a second SLDS grant, EWIS was expanded to post-secondary education, with risk levels predicting post-secondary outcomes first available to districts in 2016. The establishment and development of EWIS in Massachusetts was financed solely by federal money received through the SLDS grants in 2010 and 2015.

Massachusetts decided to base its early warning system on data-based indicators rather than on more qualitative sources, such as teachers’ knowledge of their students, for two main reasons: first, SLDS grant holders are required to establish and use data-based information systems; and second, Massachusetts did not wish to introduce new data collection exercises (and the corresponding burdens) for schools and districts, and local teacher-level knowledge did not exist within the state data system. With this decision, Massachusetts also complied with education policy decisions related to MERA regarding the importance of quantitative accountability indicators, as well as the focus of federal education policies on data-based information management systems since the NCLB. Nevertheless, since EWIS was established there have been discussions about how it adds value compared to the knowledge teachers have of their students. This is discussed further later in the case study.

Technical details

EWIS predicts the risk of individual students failing to meet relevant academic milestones without additional support. There are three risk levels: low, moderate and high (Table 7.1). These risk levels indicate “whether a student is currently on track to reach the upcoming academic milestone” (DESE and AIR, 2014[21]). EWIS uses data from the previous school year to determine the risk levels. The risk levels do not represent a relative measure but are calculated based on the individual student’s performance. It is thus possible that all students in a specific grade or even school are in the low-risk category.

Table 7.1. Student risk levels

  • Low risk: likely to reach the upcoming academic milestone. Approximately 90% of students who are at low risk will meet this academic milestone within each age group.

  • Moderate risk: moderately at risk for not reaching the upcoming academic milestone. Approximately 60% of students at moderate risk will meet this academic milestone within each age group.

  • High risk: at risk for not reaching the upcoming academic milestone. Approximately 25% of students at high risk will meet this academic milestone within each age group.

Source: Massachusetts Department of Elementary and Secondary Education and American Institutes for Research (2014[21]), Early Warning Implementation Guide: Using the Massachusetts Early Warning Indicator System (EWIS) and Local Data to Identify, Diagnose, Support, and Monitor Students in Grades 1-12, http://www.doe.mass.edu; http://www.earlywarningsystems.org.

EWIS organises student risk by four grade-level groupings: early elementary (grade levels 1-3), late elementary (grade levels 4-6), middle grades (grade levels 7-9) and high school (grade levels 10-12). Each grouping has different academic milestones (Table 7.2). For example, in the early elementary age group, students need to meet requirements in English language reading and understanding, whereas students in the late elementary group also need to meet certain standards in mathematics. The standards are related to MCAS. High school students are evaluated against four academic milestones: high school graduation, college enrolment, academic readiness and college persistence, with the last three indicators measuring the college and career readiness of students (post-secondary level). It is therefore possible for a high school student to have different risk levels for different milestones. For example, an 11th grade student could have a low risk of missing high school graduation, but a moderate risk of missing college enrolment.

EWIS risk levels are calculated by a regression model using different indicators that are validated and updated annually. The risk model was originally developed by AIR and DESE and is continuously updated by data experts at DESE. According to interview partners at DESE, the indicators need to meet certain preconditions. First, they need to fit the requirements of a rigorous statistical model; researchers from inside and outside DESE regularly assess the validity, goodness of fit and specificity of the updated EWIS models. Second, for each indicator, data must be available for every student. Third, EWIS must not require new data to be produced; it relies only on existing state-wide data collections. Data sources include the Student Information Management System (SIMS), the Student Course Schedule (SCS) (courses taken by students), the School Safety Discipline Report (SSDR) (criminal offences and discipline actions at schools), MCAS data and English language proficiency tests (DESE and AIR, 2014[21]).
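As a very rough illustration of this kind of model, the sketch below fits a single logistic regression on synthetic prior-year indicators and bins the predicted probabilities into three risk bands. It is not the DESE/AIR model: the production system uses a series of multilevel models, and the indicator names, synthetic data and cut-offs here are assumptions chosen only to show the mechanics.

```python
# Simplified sketch of an EWIS-style risk model: a single logistic regression over
# hypothetical prior-year indicators, with predicted probabilities binned into three
# risk levels. The production model is a set of multilevel models maintained by DESE;
# the indicator names, synthetic data and cut-offs below are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1_000

# Hypothetical prior-year indicators: attendance rate, suspensions,
# state assessment scaled score, number of school moves
X = np.column_stack([
    rng.uniform(0.7, 1.0, n),
    rng.poisson(0.3, n),
    rng.normal(500, 50, n),
    rng.poisson(0.2, n),
])

# Synthetic outcome: 1 = student met the milestone in the following year
logit = -20 + 12 * X[:, 0] - 0.8 * X[:, 1] + 0.02 * X[:, 2] - 0.5 * X[:, 3]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = LogisticRegression(max_iter=1_000).fit(X, y)
p_meet = model.predict_proba(X)[:, 1]  # estimated probability of meeting the milestone

def risk_level(p: float) -> str:
    """Map a probability to one of the three EWIS risk bands (hypothetical cut-offs)."""
    if p >= 0.85:
        return "low"
    if p >= 0.50:
        return "moderate"
    return "high"

levels = [risk_level(p) for p in p_meet]
print({lvl: levels.count(lvl) for lvl in ("low", "moderate", "high")})
```

Note that Table 7.1 characterises the risk levels by the share of students in each band who go on to meet the milestone, not by published probability thresholds, so the cut-offs in risk_level() are purely illustrative.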

Table 7.2. Age groups, grade levels, and academic milestones

K12 milestones:

  • Early elementary (grade levels 1-3): Reading by the end of grade 3 – meeting or exceeding expectations on the grade 3 English language arts (ELA) state assessment.

  • Late elementary (grade levels 4-6): Middle school ready – meeting or exceeding expectations on the grade 6 ELA and mathematics state assessment.

  • Middle grades (grade levels 7-9): High school ready – passing grades on all grade 9 courses.

  • High school (grade levels 10-12): High school graduation – completing high school graduation requirements in four years.

Post-secondary milestones (high school, grade levels 10-12):

  • College enrolment – enrolling in post-secondary education.

  • Academic readiness – enrolling in credit-bearing courses without developmental education.

  • College persistence – enrolling in a second year of post-secondary education.

Source: DESE and AIR (2014[21]), Early Warning Implementation Guide: Using the Massachusetts Early Warning Indicator System (EWIS) and Local Data to Identify, Diagnose, Support, and Monitor Students in Grades 1-12, http://www.doe.mass.edu; http://www.earlywarningsystems.org.

The indicators EWIS uses include information about the school-student relationship, for example attendance rates, suspensions or school moves; biographical information on the student, for example gender or special education needs; and the results of state-wide assessment tests in different academic subjects. As EWIS risk levels depend on the age group, different indicators are used for different age groups (Table 7.2). For example, to assess the college and career readiness of a high school student (post-secondary level), SAT (Scholastic Assessment Test) and AP (Advanced Placement) scores are included, as well as whether a student studies a foreign language at school (DESE and AIR, 2014[21]). As with other education data reports, EWIS data are published on a secure online data platform called Edwin Analytics (or Edwin), where reports can be accessed by authorised users at the school, district and state levels.

There is no mandatory way for schools and districts to implement EWIS locally. However, DESE suggests implementing EWIS in a six-step process over the school year (Figure 7.3). In this process, schools are advised to start by putting a team together (step 1) and reviewing EWIS data (step 2) at the beginning of the school year. In the third step, the information provided by EWIS data is combined with the experience and knowledge of educators to explore the underlying causes of a student’s poor performance (step 3). The school should then implement additional support measures for the specific student (step 4) and evaluate them (step 5). Steps three to five are repeated throughout the school year. At the end of the school year, DESE advises schools to use the insights they have gained to summarise the successes and challenges of the early warning process and refine it (step 6). Reflecting and revising is thus an important part of the early warning implementation cycle.

Figure 7.3. The early warning implementation cycle

Source: DESE (2020[22]), Early Warning Implementation Cycle, http://www.doe.mass.edu/ccte/ccr/rlo/ewis/story_html5.html.

Responsibilities

Different levels of government have been, or still are, involved in the establishment, administration and implementation of EWIS (Table 7.3). In order to establish and further develop EWIS, Massachusetts successfully participated in two rounds of the federal SLDS grant programme. This competitive programme, established in 2005, helps states develop, improve and efficiently use data management systems in order to improve student learning and outcomes and to facilitate research on student achievement. Out of the six grant rounds since 2005, Massachusetts has been successful three times. The establishment of EWIS was part of the 2009 grant round, which awarded nearly USD 13 million to Massachusetts. EWIS is listed as one of six measures in Massachusetts financed by the SLDS grant: “A more robust and nuanced risk and opportunity identification methodology is implemented by DEEC (Department of Early Education and Care) and DESE that starts at birth and continues through high school that more precisely identifies students at risk of dropping out and students who are ready for more rigorous academic course work” (US Department of Education, 2009[19]). In 2015, Massachusetts received another SLDS grant of around USD 7 million. Part of this grant was used to expand EWIS to predict readiness for post-secondary education: “As a result of this grant, Massachusetts school districts will know which of their students are on track for success in post-secondary education in time to intervene if needed” (DESE, 2015[23]).

Table 7.3. Distribution of responsibilities at different levels of government

  • Federal – Institute of Education Sciences: awards SLDS grants.

  • State – Department of Elementary and Secondary Education: collects data; develops the model; engages in public relations; provides technical assistance.

  • Substate – school districts and schools: decide on data access (districts); implement the early warning cycle (districts/schools).

The implementation of the SLDS grant is the responsibility of the state government, although it must report annually to the federal level. As described earlier, Massachusetts co-operated with AIR in the establishment and administration of EWIS; however, AIR is no longer involved. DESE annually updates the model on which EWIS is based, publishes the results on Edwin Analytics, works to make EWIS known to school districts, and provides technical assistance to school districts that use EWIS data.

Massachusetts has not received SLDS funds since September 2019, and the state now finances EWIS solely from its own resources. The number of staff responsible for EWIS-related activities has decreased considerably. According to interview partners at DESE, data analysts update the data and the model annually and publish the results on Edwin. In addition, DESE still provides school districts with a contact person for questions on the usage of EWIS. The workload for these tasks varies over the year; for example, requests from districts mostly arise in August and September, when new data have been released. Interview partners at DESE estimated that the DESE staff working on EWIS amount to about one to two full-time equivalents on average across the year.

DESE provides public school districts with access to students’ individual EWIS data. However, it is for the districts to decide if and how they use the data, and who has access within the district. According to interview partners at DESE, about 80% of districts accessed EWIS data on Edwin in the 2018/19 school year. It can be assumed, though, that the number of districts that have actually used the data – for example to get an overview of school performance or to provide data to individual schools – is significantly smaller. DESE does not monitor data usage by the districts; however, the field research for this case study gave the impression that most educators and school officials do not know about EWIS. DESE is aware of this problem and has implemented several measures to increase knowledge of EWIS:

  • It has created a website (DESE, 2020[24]) that explains what EWIS is and offers resources on how the data can be accessed and used. The website also provides hands-on reports from districts and schools that use EWIS.

  • It publishes a monthly email newsletter with news about EWIS as well as other data-related state resources (DESE, 2019[25]).

  • It offers training on EWIS: one large annual event where districts and schools can learn about EWIS, as well as several smaller events and webinars.

  • It works with the Massachusetts School Counsellors Association (MASCA), which offers professional development training to educators, to provide a series of courses on EWIS. Participants can earn professional development points, which they need for licensing in Massachusetts.

  • It has given Early Warning Implementation Grants to ten schools and school districts to enhance data use throughout the school year. Grant holders received additional support to implement the early warning implementation cycle (Figure 7.3) at their schools and met several times to exchange their experiences of EWIS.

Interviews with various holders of the Early Warning Implementation Grant show that there is high interest in early warning indicators, as they help districts and schools identify students at risk using reliable quantitative data. However, the interviews also made clear that districts and schools use EWIS in very different ways. One of the interviewed schools uses EWIS to learn about the educational biography and risk levels of students who were not pupils at the school in the previous year; since EWIS provides longitudinal information about individuals, it is a very good source of information for this purpose. Other interview partners use the grant to implement the early warning implementation cycle (Figure 7.3) and to establish a team that works on their own early warning indicators. In doing so, they add school-level data, such as school assessment tests (in addition to MCAS) and missed classes (instead of missed school days), to EWIS to better reflect local needs. Interviewed grant holders had the impression that DESE supports and even encourages this flexible use of the grant, as long as schools and districts use data-based systems to identify students at risk and direct additional support measures to them: “DESE was pretty explicit that there are multiple ways you could use the data”.

Analysis

Opportunities for innovative governance reforms

Data-based information systems have the potential to inform policy makers, schools, students and parents about educational and labour market trajectories. Massachusetts has recognised this potential and collects a large variety of data-based indicators on schools’ and students’ performance and behaviour in different databases. EWIS does not generate new data but builds on these existing data. The added value of EWIS therefore needs to be discussed: what do state authorities, districts and schools learn from EWIS that they do not already know from other data sources or from their personal experiences with students?

There are several ways in which EWIS adds value. First, it is the only data-based information system that uses a model to calculate each student’s individual risk of failing to meet a predefined academic standard. It is a very comprehensive system: it draws on longitudinal data for every student in Massachusetts from 1st to 12th grade and makes predictions regarding the college and career readiness of high school students. EWIS is sensitive to students’ age groups and uses different indicators and predefined academic milestones for four age groups. The data are updated annually, and DESE regularly evaluates the statistical fit of the model with the help of internal and external experts and updates it if needed. During the interview with the person responsible for the statistical model, it was noted that DESE pays particular attention to the way in which the indicators are used. Overall, DESE has developed a comprehensive, age-sensitive statistical model that rather accurately predicts the likelihood of students meeting or failing to meet academic goals. This was confirmed by the interviewed representatives of schools and districts, who said that the students they perceive as having difficulties at school are the same as those identified as high risk in EWIS. In theory, EWIS should therefore make it easy for teachers, principals and career counsellors to tell whether, and in what area, a student needs additional support (how practitioners actually use the data is discussed later in the chapter). EWIS users can access information on all the indicators that are part of the risk model, such as students’ results in state assessment tests or their attendance rates. Schools and districts can therefore not only identify students at risk but also assess the reasons for potential academic failure.

Second, EWIS builds on the personal interaction between students and schools. Data-based information systems are often criticised for considering only students’ academic performance, which can lead to a biased assessment of students by educators. The early warning implementation cycle used in Massachusetts shows, however, that EWIS very much takes into account students’ personal circumstances, for example a parental divorce. EWIS provides schools with an overview of where their students perform well academically and where they do not; it is for the schools to decide on support measures that improve students’ academic performance in light of their personal situations. Thus, when a student is identified as being at high risk by EWIS, educators, counsellors, the student and sometimes even the parents discuss together which support measures are needed to improve the student’s performance. The personal relationship between students and educators, and the background information educators have on their students, are therefore an important part of the data cycle.

Given that the risk levels EWIS predicts correspond with the perceptions of schools regarding which of their students are high risk, it may be questioned how EWIS can add to the knowledge schools have obtained in their day-to-day interactions with students. However, schools do not know their students to an equal degree. As one interview partner emphasised: “Easy kids are those who misbehave. Everyone knows them. The kids who are hard to assess are the quiet ones.” The data EWIS provides help schools to have better knowledge of those students who are not disruptive in class but who nevertheless have (academic) problems. Through EWIS, schools can also receive longitudinal information on new students who have moved from one school in Massachusetts to another. For one of the schools visited as part of this case study, this was the main application of EWIS. EWIS data are also very valuable for school districts. Instead of having to look at different databases to assess school performance in, for example, state-wide assessment tests and missed lessons, EWIS provides a broad overview of the performance of students in different schools.

Third, the way EWIS is administered contributes to its added value. EWIS is based on existing state-wide data collections and does not require new data to be collected. This makes the administration of the system much easier – for DESE as well as for the districts that provide data to the state. In addition, the fact that data management is done within the state’s administration allows for quick reactions to changing demands on the model: DESE can easily update the statistical model as well as the data because the expertise to do so is in-house. One interview partner noted that Massachusetts is an exception in this regard, as in most other US states responsibility for the statistical models behind education data information systems is outsourced to for-profit organisations. Furthermore, the provision of EWIS data does not consume considerable financial or human resources at DESE, which is important as the federal SLDS grant expired in 2019. The comparatively inexpensive management of EWIS ensures that the early warning system can continue after 2019.

Fourth, the voluntary nature of EWIS can be seen as an advantage. DESE provides EWIS data to every district, but districts and schools are not forced to use the data in their work. As interview partners confirmed, the relationship between DESE and school districts is not without tension: although districts enjoy autonomy in many regards, their performance is observed and assessed by DESE. In the case of EWIS, though, DESE is perceived by the interviewed districts and schools as a partner in supporting students to maximise their educational potential. Interview partners confirmed that they could contact the responsible persons at DESE if questions arose, and that DESE staff were very helpful and gave a lot of advice – not only on the use of EWIS but also on other data-related questions, such as the implementation of data cycles on the ground or the establishment of school- or district-specific databases. The provision of a large number of information resources on an EWIS-specific website, the launch of the EWIS newsletter, the collaboration with MASCA and the offer of training show that DESE wants to raise districts’ awareness of the early warning system. DESE is very open to various types of data usage as long as districts and schools build up an effective system that helps to identify and support weaker students. Thus, in the case of EWIS, all public authorities in Massachusetts’ education system work together as partners to the benefit of the state’s students.

To sum up, EWIS is a very helpful tool for identifying students who need extra support to reach academic success. It provides a very simple outcome variable – the probability that a student will meet a certain predefined academic goal – but also includes information on a large variety of indicators that can explain this outcome. In addition, EWIS is administered in an efficient and (cost-) effective way, as it is managed in-house and does not require new data collections. It can therefore be regarded as sustainable in terms of resources, even after the expiry of the SLDS grants. Nevertheless, there are several challenges related to EWIS, which are detailed in the next section.

Ongoing challenges

The biggest challenge Massachusetts faces is to increase the number of EWIS users among districts and schools. Beyond the ten grant recipients, it is not possible to say how many districts and schools use EWIS in their work, because DESE does not collect this information. As described in the section on EWIS, DESE has launched several initiatives to increase the number of districts and schools that use EWIS: the website and newsletter aim to increase general knowledge about EWIS, the collaboration with MASCA provides training in actually using EWIS, and the grants bring a broader perspective on data cycles to the grant recipients and help them connect with each other. Nevertheless, in terms of data usage by districts and schools, further improvements could still be made. One of the key questions is why EWIS still receives little attention in some districts and schools seven years after the first data release.

There are several potential reasons for this. First, the information DESE provides on EWIS is not well targeted to users. The website contains a large number of support resources, but their sheer number can create confusion and make it difficult for users to find what they need. In addition, users’ needs differ widely. Some districts and schools had been using a data implementation cycle before EWIS and only need information on specific technical issues, which is hard to find online. Others have never used data information systems before and are overwhelmed by the wealth of information on the website or in the newsletters. DESE tries to target the information better by offering substantial individual support. This is helpful for those districts and schools that already use EWIS, but it makes it more difficult to reach out to other potential users.

Second, EWIS suffers from a certain lack of knowledge in districts and schools about how data information systems can help educators, counsellors and districts improve students’ academic performance. According to DESE, EWIS should be used to identify students at risk and provide them with support measures so that they meet their academic milestones. A quick look at the data is therefore not enough. EWIS data are displayed in a large Excel sheet with several columns that give information about students’ results on the various indicators, and it takes time and knowledge to process this information and find ways to act on it. In order to use EWIS effectively, staff in schools, for example principals or career counsellors, are needed to implement the early warning implementation cycle on site, i.e. to identify students at risk, explore the underlying causes, and assign and evaluate interventions. This process is time consuming, and these staff members have a number of other obligations to fulfil. Schools might therefore see EWIS as a nice add-on to their regular work rather than a priority. This is especially the case if schools are not familiar with data or do not know how to use them. Newsletters that provide information on EWIS do not reach these schools.
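As an illustration of the kind of routine processing this requires, the sketch below shows how a school team might turn the annual spreadsheet into a short watch list for the implementation cycle. The file name and column names are hypothetical; the actual export layout is defined by DESE.

```python
# Minimal sketch of how a school team might start steps 2-3 of the implementation
# cycle from the annual spreadsheet. The file name and column names are assumptions
# for illustration; the real Edwin export uses DESE's own headers.
import pandas as pd

ewis = pd.read_excel("ewis_export_2019.xlsx")  # hypothetical file name

# Step 2: flag the students the model places at high risk for the upcoming milestone
high_risk = ewis[ewis["risk_level"] == "High"]

# Step 3 preparation: sort by an indicator that may explain the risk, e.g. attendance
watch_list = high_risk.sort_values("attendance_rate")[
    ["student_id", "grade", "attendance_rate", "suspensions", "mcas_ela_score"]
]

# Hand the list to the early warning team to explore underlying causes
watch_list.to_csv("watch_list_for_team_review.csv", index=False)
```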

It can be argued that it is the task of school districts to reach out to schools and inform them about EWIS. The Early Warning Implementation Grants were directed to districts rather than individual schools, and access to data in Edwin is determined at the district level. However, EWIS is a non-coercive initiative and districts cannot be forced to use it. Districts that know about the value of information systems for their work, and that are familiar with using data, are more likely to embrace a system such as EWIS than districts not used to information systems. The competitive early implementation grants of DESE have reinforced inequality between school districts in the usage of EWIS: the interviews conducted by the OECD team showed that the successful districts had prior knowledge about, and experience with, information management systems, for example because they had used other data systems before or because individual district employees are data specialists. Districts without this knowledge have probably not taken notice of the grant call and have not applied.

Thus, schools and districts with a prior data usage culture are significantly more likely to use EWIS because they recognise its value for their work. For DESE (i.e. the state government), the challenge is to devise policies that encourage schools that already use EWIS to use the system more effectively, as well as to expand the number of schools that use EWIS in the first place.

As mentioned previously, the interviews made clear that even districts and schools that implement EWIS and the early warning implementation cycle do so in very different ways. According to these users, this is because the data EWIS contains are – naturally – limited. EWIS uses data from various state-wide databases, but these data are not always tailored to the needs of the schools. For example, one interviewed school had realised that, for identifying students at risk, it was not the number of missed school days (an EWIS indicator) that was decisive, but the number of missed classes. In addition, DESE releases the EWIS data annually in August, whereas schools collect data on their students throughout the year. As a consequence, several schools and districts have established their own data information systems, with indicators more targeted to local needs.

DESE does not know how districts and schools use EWIS data because no evaluation has been conducted. This is part of a larger problem: since the end of the federal SLDS grant in 2019, EWIS has had to cope with very limited financial resources. Interview partners confirmed that the state spends enough money on EWIS to keep it running, with sufficient resources to provide the data, update the model annually and give some support to districts and schools, such as training and webinars. However, more expensive activities to increase the number of users, such as a second round of early implementation grants or improving the quality of EWIS through a comprehensive evaluation, cannot be financed through the state funds provided for EWIS.

The final main challenge for EWIS concerns data privacy. EWIS data are available at the individual student level, which increases the importance of complying with data privacy rules. DESE ensures that the servers on which data are stored are safe from external attacks. However, its decentralised set-up makes EWIS prone to internal violations of data privacy rules. DESE can only partly control who has access to the data because school districts decide on this issue; compliance with data privacy rules is therefore difficult for DESE to monitor. The risk that data privacy is (unintentionally) violated is increased by the fact that EWIS users’ experience with data information systems varies significantly. Interestingly, compliance with data privacy rules was not perceived as a major issue during interviews, perhaps because data privacy plays a less prominent role in the United States than in Europe (Heisenberg, 2005[26]).

Summary

To sum up, EWIS relates to two dimensions of governance: co-ordination across levels of government and building integrated information systems. EWIS was developed through the support of a federal grant (the SLDS grant programme). It is administered by DESE and implemented in the 525 public school districts and their schools in Massachusetts at the local level. EWIS is a non-coercive measure, which means that districts and schools are not required to use it. The advantage of this is that DESE and the districts act as partners to boost the educational performance of students. However, it also requires DESE to invest in the training of district administrators and educators so that they can effectively and efficiently use the data EWIS provides to improve the academic performance of Massachusetts’ students. This includes various aspects, such as the implementation of the data cycle and compliance with data privacy rules.

EWIS is also an example of how to build integrated information systems. It provides a statistical model that calculates the individual risk level of students from 1st to 12th grade in meeting certain predefined academic milestones. It also offers insights into factors that could potentially explain why students fail to meet these milestones. EWIS relies on multiple quantitative data sources from existing state-wide collections. As a consequence, the administration of EWIS is comparatively easy as no new data need to be collected. The responsibilities between the various actors engaged in EWIS, most importantly DESE and districts and schools, are clearly defined. However, EWIS still struggles with the fact that the needs and demands of its users differ.

Policy recommendations

Based on the analysis of the strengths and weaknesses of Massachusetts’ EWIS, this section presents a number of policy recommendations specific to Massachusetts’ early warning system. For schools to use EWIS, three conditions must be met: 1) schools and districts must be aware of the advantages the system can bring to their work; 2) schools and districts must have the knowledge to work with data; and 3) schools and districts must have the financial and personnel resources to use EWIS. The policy recommendations below for the future of EWIS follow from this analysis. More general policy recommendations are developed and presented in the final chapter of this report.

Establish a monitoring system to constantly evaluate EWIS

EWIS is a complex system that comprises several elements: a statistical model that calculates risk levels for every student in Massachusetts, a variety of indicators compiled from different state-wide data collections, and the early warning implementation cycle that suggests how to use the information provided by EWIS in schools. EWIS involves different levels of government. While the division of responsibilities between the federal Institute of Education Sciences, which awarded the two SLDS grants to Massachusetts, and the state’s DESE is clear, this is not necessarily the case for interactions between DESE and districts and schools, owing to the decentralised nature of EWIS. EWIS is a non-coercive measure that helps districts and schools identify and support academically weak students, but its use cannot be enforced by DESE.

Since EWIS data were first provided to districts in 2012, the statistical model on which the system is built has been evaluated and updated several times. However, no monitoring of the usage of EWIS takes place. DESE can keep track of the number of downloads of EWIS data from the Edwin data platform, but EWIS is much more than a spreadsheet of data: the data need to be interpreted, and consequences for academically weak students need to be implemented and evaluated. DESE currently does not know how many districts and schools implement EWIS or for what purposes they use the data. To ensure that EWIS contributes to its original goal, namely supporting students at risk of not meeting age-related academic goals, Massachusetts should implement a monitoring system that regularly evaluates the use of EWIS by districts and schools, for example by tracking the number of schools that use the early warning implementation cycle each year.

Empower schools and districts to work with information management systems

Although there is no reliable information on which districts and schools use EWIS, it can be assumed that the number is not very high. DESE has tried to raise awareness of the advantages of EWIS through various initiatives, including a website, newsletters and several events throughout the year for districts and schools. Through the early implementation grants, DESE supported ten districts and schools that were keen to implement the data cycle on which EWIS builds. However, EWIS suffers from the fundamental problem that many potential users lack the knowledge and skills to use the available data appropriately. Data need to be interpreted and analysed, which means that users have to know how the data have been generated and processed. Users also need at least some skills in working with information systems. However, data literacy is lacking in many of Massachusetts’ districts and schools.

It is therefore not enough to provide specific information resources on EWIS. Rather, potential users of EWIS need to be enabled and empowered to use data information systems in an efficient and effective way. The collaboration between DESE and the Massachusetts School Counsellors Association (MASCA) is a first step towards this goal. Together, they offer courses on how to use EWIS in schools and award the professional development points that school counsellors in Massachusetts need for licensing. Similar initiatives could be established with other stakeholders, such as the Massachusetts School Administrators’ Association or teacher unions. Additionally, Massachusetts could make data literacy a mandatory part of the curricula for prospective teachers and student career counsellors. For this, more resources will be needed, for example for additional early implementation grants or the training of prospective users. Districts must also ensure that schools are adequately staffed with people who have the time and knowledge to use EWIS, for example guidance counsellors or school psychologists.

Improve the usability of EWIS

Even if users have the knowledge and skills to use data information systems, they might have problems using EWIS because its usability is limited. EWIS data come in a large spreadsheet, with the columns showing the risk levels, the indicators on which the risk model builds and additional socio-economic variables, and the rows listing each student’s values for every variable. The dataset can be found on Edwin and downloaded as an Excel file for further analysis. For inexperienced users, however, the huge dataset is not easy to handle. DESE and policy makers could consider how to improve usability, for instance by developing transparent and easy-to-use tools for the Edwin website that allow for simple analyses of the data.
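One example of such a simple analysis is a summary of risk levels by school, sketched below; the file and column names are hypothetical placeholders rather than the actual Edwin export layout.

```python
# Sketch of the kind of simple, transparent summary a reporting tool on Edwin could
# offer: the share of students at each risk level, by school. The file and column
# names are hypothetical placeholders.
import pandas as pd

ewis = pd.read_excel("ewis_export_2019.xlsx")

shares = (
    ewis.groupby(["school", "risk_level"])
        .size()
        .unstack(fill_value=0)
)
shares = shares.div(shares.sum(axis=1), axis=0).round(2)

print(shares)  # one row per school, one column per risk level (low/moderate/high)
```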

The information resources that DESE provides to users are also not very targeted. The website contains a large number of links and a great deal of information; a cleaner design that guides users directly to the resources they need could help considerably. The newsletter could also be used more strategically: for example, instead of simply announcing that a new round of EWIS data has been released, DESE could promote the new data as a resource that helps improve students’ academic performance. DESE’s new strategy of letting EWIS users report on their experiences is another way to improve its public relations activities.

References

[20] American Institutes for Research (2020), Massachusetts Early Warning Indicator System, American Institutes for Research, Washington, DC, https://www.air.org/project/massachusetts-early-warning-indicator-system (accessed on 9 February 2020).

[3] Baker, C. (2019), An Education Promise We Can Keep, Boston Globe, Boston.

[13] Chester, M. (2014), Building on 20 Years of Massachusetts Education Reform, Massachusetts Department of Elementary and Secondary Education, Malden, MA.

[7] Commonwealth of Massachusetts (2020), Massachusetts NAEP Results Lead Nation for 12th Year, https://www.mass.gov/news/massachusetts-naep-results-lead-nation-for-12th-year (accessed on 9 February 2020).

[4] Dee, T. and J. Levine (2004), “The fate of new funding: Evidence from Massachusetts’ education finance reforms”, Educational Evaluation and Policy Analysis, Vol. 26/3, pp. 199-215.

[16] DESE (2020), District and School Accountability Systems, Massachusetts Department of Elementary and Secondary Education, Malden, MA, http://www.doe.mass.edu/accountability/ (accessed on 9 February 2020).

[22] DESE (2020), Early Warning Implementation Cycle, Massachusetts Department of Elementary and Secondary Education, Malden, MA, http://www.doe.mass.edu/ccte/ccr/rlo/ewis/story_html5.html (accessed on 9 February 2020).

[24] DESE (2020), Early Warning Indicator System (EWIS), Massachusetts Department of Elementary and Secondary Education, Malden, MA, http://www.doe.mass.edu/ccte/ccr/ewis/ (accessed on 9 February 2020).

[25] DESE (2019), EWIS Distribution List, Massachusetts Department of Elementary and Secondary Education, Malden, MA, https://us14.campaign-archive.com/home/?u=d8f37d1a90dacd97f207f0b4a&id=95b57784c7 (accessed on 9 February 2020).

[23] DESE (2015), 2015 Massachusetts SLDS Grant Application, Massachusetts Department of Elementary and Secondary Education, Malden, MA.

[17] DESE (2011), Description of M.G.L. Ch 69, Section 1J: An Act Relative to the Achievement Gap Process for “Underperforming” Schools, Massachusetts Department of Elementary and Secondary Education, Malden, MA, http://www.doe.mass.edu/turnaround/level4/level-4-legislation.pdf (accessed on 21 February 2020).

[21] DESE and AIR (2014), Early Warning Implementation Guide: Using the Massachusetts Early Warning Indicator System (EWIS) and Local Data to Identify, Diagnose, Support, and Monitor Students in Grades 1-12, Massachusetts Department of Elementary and Secondary Education; American Institutes for Research, Malden, MA, http://www.doe.mass.edu; http://www.earlywarningsystems.org (accessed on 9 February 2020).

[5] Guryan, J. (2001), “Does money matter? Regression-discontinuity estimates from education finance reform in Massachusetts”, NBER Working Paper No. 8269, National Bureau of Economic Research, https://doi.org/10.3386/w8269.

[26] Heisenberg, D. (2005), Negotiating Privacy: The European Union, the United States, and Personal Data Protection, Lynne Rienner Publishers, Boulder, CO.

[12] Lee, V. and K. Blagg (2018), School District Funding in Massachusetts: Computing the Effects of Changes to the Chapter 70 Funding Formula, Urban Institute, Washington DC.

[10] Massachusetts Education Equity Partnership (2018), Number One for Some: Opportunity and Achievement in Massachusetts, Massachusetts Education Equity Partnership, Massachusetts.

[6] McDermott, K. (2004), “Incentives, capacity, and implementation: Evidence from Massachusetts education reform”, Journal of Public Administration Research and Theory, Vol. 16/1, pp. 45-65.

[18] McGuinn, P. (2005), “The national schoolmarm: No Child Left Behind and the new educational federalism”, Publius: The Journal of Federalism, Vol. 35/1, pp. 41-68.

[9] National Center for Education Statistics (2018), Digest of Education Statistics-Most Current Digest Tables, U.S. Department of Education - National Center for Education Statistics, Washington, DC, https://nces.ed.gov/programs/digest/current_tables.asp (accessed on 9 February 2020).

[1] OECD (2016), PISA 2015 Results (Volume I): Excellence and Equity in Education, PISA, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264266490-en.

[15] Riley, J. (2019), The Commissioner’s report to the Board: Our way forward for Massachusetts K-12 public education, Department of Elementary and Secondary Education, Malden.

[2] Rowe, C. (2016), Massachusetts is a lot like us, so why are its schools so much better?, The Seattle Times, Seattle.

[8] U.S. News (2020), Best High Schools in the U.S., https://www.usnews.com/education/best-high-schools (accessed on 9 February 2020).

[11] US Census Bureau (2018), 2017 public elementary-secondary education finance data, US Census Bureau, Suitland, MD.

[19] US Department of Education (2009), Application for grants under the Statewide Longitudinal Data System Recovery Act grants: PR/award R384A100044, US Department of Education, Washington, DC.

[14] Vazins, J. and M. Stout (2019), After Years of Debate, Top Mass. Lawmakers Unveil School Funding Plan, Boston Globe, Boston.

Note

1. The exception is the state of Hawaii, which only has one school district.
