6. Survey implementation: a brief overview

Koen Van Lieshout
OECD
Anthony Arundel
University of Maastricht
Netherlands
Stéphan Vincent-Lancrin
OECD

This final chapter provides a brief overview of how to implement the three types of questionnaires prepared for this report (Chapters 3, 4 and 5). National surveys are likely to be implemented by experts from National/Regional Statistical Offices, who are probably familiar with the issues and concepts discussed in this chapter. The chapter is thus mainly designed to be of value for the leaders of educational institutions who wish to implement one of the three questionnaires in their own institution. Some details are also provided for larger-scale surveys at the regional or district level. The chapter outlines best survey practices and provides guidance on logistical challenges and advice on avoiding pitfalls. Additional details on how to run a survey are available in Chapter 9 of the Oslo Manual (OECD/Eurostat, 2018[1]) as well as in other sources, for example Fink (2003[2]).

Most commonly, surveys have a statistical purpose: they aim to provide statistically representative data about a given phenomenon (here, innovation) that accurately capture how it manifests in a specific population of interest. This requires either a census or a random sample, and a protocol to maintain representativeness and maximise response rates. The implementation of the main innovation questionnaire (Chapter 3) at the regional or district level is likely to require representative data.

Statistical surveys will typically be organised by the region or district, with the goal of obtaining information from representative samples of schools, teachers, and students. The sampling method can use a random sample of schools, in combination with randomly selected classrooms within each sampled school. If schools or classrooms have contrasting characteristics, the samples should be stratified so that these characteristics are represented (for example, small and large schools). If contact data are available, it is also possible to randomly sample all teachers and school leaders within a region. In either case, the sample needs to be sufficiently large to minimise standard errors (chance variations in the results).
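
To illustrate, the short Python sketch below shows one way a stratified random sample of schools could be drawn. The file name (schools.csv), the columns school_id and size_stratum, and the 20% sampling fraction are all illustrative assumptions, not part of the questionnaires or of any prescribed method.

```python
# Minimal sketch: drawing a stratified random sample of schools by size stratum.
# Assumes a hypothetical file "schools.csv" with columns "school_id" and
# "size_stratum" (e.g. "small"/"large"); the 20% fraction is illustrative.
import pandas as pd

schools = pd.read_csv("schools.csv")

# Sample 20% of schools within each stratum so that small and large schools
# are both represented.
sample = (
    schools.groupby("size_stratum", group_keys=False)
    .apply(lambda stratum: stratum.sample(frac=0.20, random_state=42))
)
print(sample["school_id"].tolist())
```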

If the questionnaires for students are used, the most practical option is to sample institutions because contact data for individual students may not be available or accessible, due to privacy regulations.

While all surveys in this report could in principle be used for statistical purposes, two of them were also designed to be of value at the institution level (innovation culture and innovation for equity). They can be used for self-reflection, and data could be collected at the school level, particularly if the results are followed up with a workshop or other forms of additional school-level discussion. While ideally this should involve efforts to achieve high response rates, the surveys can be used to start a dialogue even if the responses are not fully representative of all institution stakeholders.

Research surveys should typically obtain informed consent from all respondents, inform them about the objectives and use of the collected data, and respect regulations on the collection and storage of personal data. When surveys are carried out for operational purposes at the school level, some of these obligations (or good practices) may not apply. Some statistical surveys are mandatory, in which case they do not require informed consent (although they should still provide information about the use of the collected data).

Given the different groups involved, the survey protocol (the rules for conducting the survey) needs to ensure that all participants can fill out the questionnaire at convenient times and that steps are taken to minimise non-participation and biases arising from participants discussing the survey before everyone has completed their questionnaires. Even with buy-in from school staff, a protocol is required to set out the incentives and methods that ensure the security and validity of the data and to minimise problems that can thwart successful data collection.

The protocol will differ depending on which groups are surveyed. It will be much simpler if only school leaders and teachers are surveyed, since these two groups can provide consent themselves and it should be possible to obtain their contact details if the survey is conducted at the regional or district level. If students are involved, questionnaires will need to be distributed within classrooms, and students who are minors will require the consent of a parent/guardian (and ideally should also give their own assent). The instructions given in this section can be used to develop the protocol for a regional/district-level survey or a school-level survey.

The best practice for regional or district surveys is to send invitation letters and questionnaires to named respondents for whom contact details are available, such as an email address for an online survey or a postal address for a mailed survey. These data permit follow-up with non-respondents and the calculation of response rates. This practice may be applicable for school leaders and teachers, but it is unlikely to be possible for students.

A basic protocol for an online or mailed survey of school leaders and teachers is as follows (an illustrative scheduling sketch follows the list):

  1. Send a letter of invitation that explains the purpose of the survey and the time required to complete it, offers confidentiality, and describes informed consent. If the survey collects only minimal personal details, such as age, gender, job position and highest level of education, the letter can state that informed consent will be assumed if the respondent returns a completed questionnaire.

  2. After approximately one week, send the questionnaire by post or by an email that includes a link to the online version. The email should contain a confidential access code limited to the specific respondent. This is required for follow-up.

  3. One to two weeks after the initial mail-out, send a one-page reminder letter to non-respondents.

  4. Four to six weeks after the first reminder, send a second reminder letter. This should differ in wording from the first reminder.

  5. Two weeks after the second reminder, send a third reminder letter or begin telephone reminder calls (if supported by the budget).
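
To illustrate the timeline above, the Python sketch below derives the mail-out and reminder dates from a chosen invitation date and generates a confidential per-respondent access code for the online version (step 2). The starting date, the exact intervals chosen within the ranges above, and the code length are illustrative assumptions.

```python
# Minimal sketch: a timeline for the five protocol steps above, plus an example
# access code for the online questionnaire. Dates and code length are illustrative.
import secrets
from datetime import date, timedelta

invitation = date(2024, 9, 2)                          # step 1: letter of invitation
questionnaire = invitation + timedelta(weeks=1)        # step 2: about one week later
first_reminder = questionnaire + timedelta(weeks=2)    # step 3: 1-2 weeks after mail-out
second_reminder = first_reminder + timedelta(weeks=5)  # step 4: 4-6 weeks after first reminder
third_reminder = second_reminder + timedelta(weeks=2)  # step 5: 2 weeks after second reminder

for label, day in [("Invitation", invitation),
                   ("Questionnaire", questionnaire),
                   ("First reminder", first_reminder),
                   ("Second reminder", second_reminder),
                   ("Third reminder / phone calls", third_reminder)]:
    print(f"{label}: {day.isoformat()}")

# One unique, confidential access code per named respondent (step 2),
# so that reminders can later be targeted at non-respondents.
print("Example access code:", secrets.token_urlsafe(8))
```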

Permission to conduct the survey may be required from senior managers at the regional or district level if the survey will be implemented at the school level and include students. For tertiary institutions, permission may be required from the head of the institution. Some countries require ethics approval for surveys by tertiary institutions, notably if they are conducted for research purposes (rather than as part of the operation of the institution). Institutional level surveys can also cover school leaders and teachers.

Large institutions such as universities and comprehensive high schools can use random samples of teachers and students or stratified random samples, such as by course. For smaller schools it may be easier to run a census, in which all teachers and students are asked to complete the questionnaire. In both cases, reaching a good response rate is important.

A survey at the institutional level requires an identified administrator at each school (or tertiary institution) who is responsible for distributing questionnaires to school leaders, teachers, administrative staff (if included) and students, co-ordinating data collection, identifying contacts at the school, and facilitating preparations among school staff.

The responsible administrator has four main tasks: maintaining a datafile (usually in Excel or another spreadsheet programme) of all activities to prepare and implement the survey, organising teacher/invigilator training, distributing consent forms, and preparing survey packets. The datafile (confidential) should include the names, contact details, and classroom or homeroom of all students and, for each, whether a consent form was obtained and whether the individual completed the questionnaire. To prevent biases in responses, the protocol for school-level surveys needs to ensure that survey conditions are the same across classrooms and that invigilators and teachers receive the same instructions and explanations for carrying out the surveys. Teacher/invigilator training is necessary if teachers are responsible for distributing questionnaires to students. In-person training for teachers should be conducted in each school approximately one to three weeks prior to data collection. Survey coordinators/facilitators will need to give teachers information about the survey administration process and parental/student consent procedures.
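
As an illustration, the confidential tracking datafile could take a simple tabular form such as the Python sketch below, in which the file name, the column names and the example row are all hypothetical.

```python
# Minimal sketch of a confidential tracking datafile as described above.
# The file name, columns and example row are hypothetical.
import csv

columns = [
    "student_name", "contact_details", "classroom",
    "consent_form_received", "questionnaire_completed",
]

with open("survey_tracking.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=columns)
    writer.writeheader()
    writer.writerow({
        "student_name": "Example Student",
        "contact_details": "parent@example.org",
        "classroom": "7B",
        "consent_form_received": "yes",
        "questionnaire_completed": "no",
    })
```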

In many countries, consent will need to be obtained from school leaders, teachers, and students and/or parents (if students are minors). Depending on the legislation in the respective country, passive consent forms could be a strategy to limit non-participation. For instance, parents only fill out the form if they do not want their children to take the survey. In some countries, the legislation requires active consent (e.g. GDPR in the European Union). In addition to the national language, consent forms for students should be provided in the common minority languages spoken at home. Completed consent forms should be collected by the responsible administrator, who can use the information to determine the number of questionnaires and decoy booklets to be provided to each class and provide each teacher with a list of students to receive the questionnaire. Decoy booklets can be given to students who lack consent. The decoy booklets should be multipage scannable booklets with a cover and back identical to the survey but with other text instead of questions on the inside. Students who receive such booklets should be able to hold them at their desks without peers knowing if they did or did not participate, and why.
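
Building on the hypothetical tracking file sketched above, the administrator could derive the number of survey booklets and decoy booklets needed for each class as in the sketch below; the column names and consent codes remain illustrative.

```python
# Minimal sketch: counting survey and decoy booklets per class from the
# hypothetical tracking file above (students without consent receive decoys).
import pandas as pd

tracking = pd.read_csv("survey_tracking.csv")
consented = tracking["consent_form_received"].eq("yes")

per_class = pd.DataFrame({
    "survey_booklets": consented.groupby(tracking["classroom"]).sum(),
    "decoy_booklets": (~consented).groupby(tracking["classroom"]).sum(),
})
print(per_class)
```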

The responsible administrator will need to prepare survey packets before survey implementation. Packets for each classroom should include scripts for the invigilator, survey booklets, decoy booklets, pencils/pens and a classroom information form. These packets should be prepared using classroom-level rosters. The survey administration materials should be put in an envelope with a label attached to the top showing the school name, teacher name, class period, and number of students enrolled in the class. These survey packets should be delivered to teacher mailboxes (or another location where teachers routinely collect materials) at least one day prior to survey administration.

On data collection day, the organising team needs to ensure that teachers have the necessary supplies and be ready to answer last-minute questions. Teachers or invigilators need to record information on the class-level information form, including the number of students registered for the class and the number of students present during survey administration. Depending on the administrative organisation, teachers or invigilators could also record the number of students/parents who did not consent to participating in the survey; it may be easier to record this at the school level and provide the information to the teacher or invigilator in advance. The person administering the survey should distribute the questionnaires and read the script to students. The script should explain the context of the survey and what is expected of the students. Students should be reminded that the surveys are anonymous and that they should not write their names on them.

While students fill out the survey, the teacher or invigilator may need to answer student questions, as some of the concepts covered in the survey may be confusing. School leaders and staff could complete the survey at the same time as students or at a separate time, for example by organising an hour after class time for all teachers to fill out the survey at once.

Surveys for school leaders, teachers and students can also be provided and completed online. Online completion removes the need for data entry after the questionnaires are completed, but requires additional administration. For students, the organising team will need to ensure that laptops or computer rooms are available for each classroom at specified times so that the surveys can be filled out in class. It is not advisable to give students the option of remote completion, as this will result in much lower participation rates and prevent the participation of students without access to the internet and a computer. (Remote completion can more easily be used for school leaders and teachers, for whom consent can also be obtained as they fill out the questionnaire.)

Paper questionnaires need to be collected after completion. Teachers must not look at the questionnaires or even appear to look at them. It may be preferable to arrange for one student to collect the questionnaires (face down) and place them in an envelope before giving the envelope to the class teacher. These envelopes should be collected by an administrator.

After collection, the data in paper questionnaires need to be entered into a data capture programme that is specifically designed to look like the survey questionnaire, unless a machine-readable paper questionnaire is used. The next step is data analysis, which can use a common software programme such as Excel or dedicated statistical software such as SPSS or Stata. Data analysis needs to be done by individuals with experience with the chosen software. The basic requirement is to produce descriptive results such as frequencies (the distribution of responses to each question) and cross-tabulations (the distribution of responses by other variables of interest, such as student age, gender, etc.). Each question (other than questions requesting text data) must be assigned a numerical code, for instance 1 for yes and 0 for no. Likert scales can be coded as 3 for high, 2 for moderate and 1 for low importance. Missing data must be given a separate code from “don’t know/not relevant” (system-missing values appear as a period in SPSS), as don’t know responses are interpretable survey responses. “Don’t know/not relevant” responses should be coded as -9 or -99 to prevent confusion with the scale used for Likert questions.
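
The coding scheme described above could be applied as in the Python sketch below, which assumes a hypothetical response file (student_responses.csv) and two illustrative question columns; the value labels are examples only.

```python
# Minimal sketch of the coding scheme described above, applied to two
# hypothetical question columns; file and column names are illustrative.
import pandas as pd

responses = pd.read_csv("student_responses.csv")

# Yes/no question: 1 = yes, 0 = no. Unmapped or blank answers become missing (NaN).
responses["q1_yes_no"] = responses["q1_yes_no"].map({"yes": 1, "no": 0})

# Likert question: 3 = high, 2 = moderate, 1 = low importance;
# "don't know/not relevant" is coded -9 to keep it distinct from the scale.
responses["q2_importance"] = responses["q2_importance"].map(
    {"high": 3, "moderate": 2, "low": 1, "don't know/not relevant": -9}
)

# Genuinely missing answers remain NaN (system-missing), so they cannot be
# confused with interpretable "don't know/not relevant" responses.
print(responses[["q1_yes_no", "q2_importance"]].head())
```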

The descriptive analyses should explore the types of practices that are the least or most common and differences in answers by specific groups of participants. Differences in perceptions or opinions between different stakeholders can be used to identify differences between the intended effects of school policy (school leaders) and the perceptions of these effects in practice (teaching staff). Other opportunities to observe disparities between intended and experienced effects can be found by comparing student survey results with the answers of teachers and school leaders for those questions that are similar or identical across the different questionnaires.
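
For example, frequencies and a simple cross-tabulation by respondent group could be produced as in the sketch below, which assumes a hypothetical coded data file and illustrative column names.

```python
# Minimal sketch: a frequency table for one question and a cross-tabulation of
# the same question by respondent group; file and column names are illustrative.
import pandas as pd

responses = pd.read_csv("coded_responses.csv")

# Distribution of responses to a single question (as proportions).
print(responses["q2_importance"].value_counts(normalize=True))

# Cross-tabulation by respondent group (e.g. school leader, teacher, student),
# shown as row percentages to compare groups of different sizes.
print(pd.crosstab(responses["respondent_group"], responses["q2_importance"],
                  normalize="index"))
```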

The purpose of the preliminary analyses is to establish baselines and identify areas for improvement: innovation activities for the main innovation questionnaire, the innovation culture for the culture questionnaire, and equity conditions for the equity questionnaire. It is important to develop “next steps” and recommendations based on the descriptive results. This can be supported by a short report that relies heavily on graphs and visuals to communicate the observations, and by follow-on discussions of the results among teachers, school leaders and students, for example at a school assembly or through a workshop, as for the equity questionnaire.
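
A simple frequency chart for such a report could be produced as in the sketch below, which reuses the hypothetical coded data file and question column from the previous sketch; the labels are illustrative.

```python
# Minimal sketch: a bar chart of coded responses for a short report;
# the file, column and labels are illustrative.
import pandas as pd
import matplotlib.pyplot as plt

responses = pd.read_csv("coded_responses.csv")

counts = responses["q2_importance"].value_counts().sort_index()
counts.plot(kind="bar", title="Importance of the practice")
plt.xlabel("Response code (1 = low, 2 = moderate, 3 = high, -9 = don't know)")
plt.ylabel("Number of respondents")
plt.tight_layout()
plt.savefig("q2_importance.png")
```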

For regional or district level surveys, additional insights can be gained from linking survey data to school-level information from other databases, such as school performance on a range of indicators. The survey results could also be used for additional statistical analyses, such as correlation, principal component or factor analyses of the relationships between different innovation activities and outcomes of interest. Further observations could be made by connecting school-level data with data on specific innovations, legislative or policy changes at the school, regional or district level, or data on the demographic and socio-economic characteristics of each school. Moreover, Cronbach alpha reliability values, ANOVA and Bonferroni-adjusted comparisons may be valuable if a case can be made for comparative analyses between groups.
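
By way of illustration, the sketch below computes Cronbach's alpha for a hypothetical set of Likert items and a one-way ANOVA of a scale score across respondent groups; the file, item and group column names are illustrative assumptions.

```python
# Minimal sketch: Cronbach's alpha for four hypothetical Likert items and a
# one-way ANOVA of a scale score across respondent groups (names illustrative).
import pandas as pd
from scipy import stats

data = pd.read_csv("coded_responses.csv")

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total score).
items = data[["item1", "item2", "item3", "item4"]].dropna()
k = items.shape[1]
alpha = (k / (k - 1)) * (1 - items.var(ddof=1).sum() / items.sum(axis=1).var(ddof=1))
print(f"Cronbach's alpha: {alpha:.2f}")

# One-way ANOVA comparing a scale score across respondent groups.
subset = data.dropna(subset=["scale_score", "respondent_group"])
groups = [g["scale_score"].values for _, g in subset.groupby("respondent_group")]
f_stat, p_value = stats.f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```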

The data for the equity survey can be analysed to identify results for specific groups of students and to identify common denominators among activities that are frequently identified as important for changing or improving equity. Other comparative insights could be gained from the differences in answers between teachers and school leaders: the two versions of the survey are similar and thus provide opportunities to compare perceptions of where the opportunities and challenges in improving equity lie. As noted above, such differences in perceptions or opinions can be used to contrast the intended effects of school policy (school leaders) with how these effects are perceived in practice (teaching staff), and student responses can be compared with the answers of teachers and school leaders.

References

[2] Fink, A. (2003), The Survey Handbook, Sage Publications, London.

[1] OECD/Eurostat (2018), Oslo Manual 2018: Guidelines for Collecting, Reporting and Using Data on Innovation, 4th Edition, The Measurement of Scientific, Technological and Innovation Activities, OECD Publishing, Paris/Eurostat, Luxembourg, https://doi.org/10.1787/9789264304604-en.
