Foreword

Digital technologies are revolutionising many aspects of our everyday life, and skills assessments are no exception. The Survey of Adult Skills, a product of the Programme for the International Assessment of Adult Competencies (PIAAC) (hereafter referred to as “PIAAC”), was the first large-scale international assessment designed from the outset to be delivered primarily on computers.

This choice of computer-based delivery was motivated by several considerations. First, an assessment of information-processing skills in the 21st century needed to test adults’ capacity to access and interpret information in digital formats. Second, assessment tasks delivered by computer can be highly interactive. This makes it possible to refine the measurement of traditional information-processing skills, as well as to develop measures of innovative testing domains, such as problem-solving in technology-rich environments or adaptive problem-solving. Finally, computer delivery provides other benefits, such as increased efficiency, improvements in data quality (e.g. automatic scoring of answers and lower loss of data), more complex test designs and easier management of survey administration.

Another important consequence of computer-based delivery is that the testing platform can record information about all the interactions between test takers and the computer. This information is stored in log files and is also known as “process data”.

By providing a way to observe how test takers approach and try to solve the tasks presented to them, process data have the potential to substantially enrich the information we get from skills assessments and, therefore, the lessons we learn from them. Process data can be used to further refine the measurement of skills traditionally assessed and to enlarge the set of indicators we obtain from assessments. They can be used to proxy unobservable traits, such as motivation and perseverance, to better understand the relationship between these attitudes and performance, and to better interpret and contextualise the results of large-scale assessments and the differences we normally observe across countries or socio-demographic groups.

This report offers a roadmap for readers interested in knowing more about process data and how they can be used for research purposes and to inform policy making. It describes currently available process data from PIAAC and provides examples of the analysis that can be undertaken with them.

At the same time, the report acknowledges that research on process data is still in its infancy. For the moment, log files are largely an unintended by-product of computer-based administration. Not all information that we would like to have in them has been recorded, and the available information is often cumbersome to extract and, more importantly, to interpret.

But the path before us is now clearly traced. The analysis of process data will increasingly inform test development through a better understanding of the strategies and behaviour of test takers, and will in turn be used to design better assessments. In an iterative process, assessments will increasingly be designed to exploit the fact that we now have the tools to observe not only whether test takers are able to solve a task presented to them, but also how they arrived at the solution, where they went right and where they went wrong.

If research on process data fulfils its promise, large-scale assessments will no longer be used only as a tool to describe where OECD countries stand in terms of the skills of their adult and student populations, but also as a tool that teaches them how they can improve.

Andreas Schleicher,

Director for Education and Skills and Special Advisor on Education Policy to the Secretary-General, OECD
