12. Assessing the generic skills of undergraduate students in Finland

Jani Ursin
University of Jyväskylä
Finland
Heidi Hyytinen
University of Helsinki
Finland

This chapter is based on the final report of the KAPPAS! project. For more information, please see Ursin, J., H. Hyytinen and K. Silvennoinen (eds.) (2021), Assessment of Undergraduates’ Generic Skills in Finland – Findings of the Kappas! Project. Ministry of Education and Culture. Publications of Ministry of Education and Culture 2021:31.

The Finnish education system can be seen as having two main streams, i.e. a general stream (which provides students with general knowledge, information and skills) and a vocational stream (which provides students with vocational and professional competences). The system is flexible, allowing easy movement between the streams during one’s educational career, and few students come to a dead-end. The higher education (HE) system also reflects the two streams, consisting as it does of two complementary sectors: one with 24 professionally-oriented universities of applied sciences (UASs), and the other with 14 research-intensive universities. The UASs train professionals in response to labour market needs and carry out research, innovation and development activities. For their part, the research-intensive universities conduct scientific research, and provide instruction and postgraduate education based on that research. Both types of higher education institutions (HEIs) enjoy autonomy, with freedom to conduct education and research. This freedom is secured in the Finnish Constitution and guaranteed by the laws governing HEIs (Ammattikorkeakoululaki, 2014[1]; Universities Act, 2009[2]). Nonetheless, as part of the government, the Ministry of Education and Culture (MoEC) allocates core funding to the HEIs and steers the activities of institutions via a process of management by results (Ursin, Hyytinen and Silvennoinen, 2019[3]).

Currently (as of 2021), Finland has 38 HEIs with around 300 000 students (Vipunen – opetushallinnon tilastopalvelu, 2019[4]). Each higher education institution decides on the students to be admitted, and on the criteria for admission. There are three ways of admitting students to the HEIs: entrance examinations, grades in the Matriculation Examination or in vocational upper secondary qualifications, and the Open University route. The first two are the principal forms of student admission. Recently, the admission of higher education students was reformed in such a way as to place more weight on grades. This means that from 2020 onwards more than half of student places have been filled on the basis of grades, thereby highlighting the importance of success in upper secondary education and especially in the Matriculation Examination. Higher education is free for all students with the exception of those coming from countries that are not members of the European Union or the European Economic Area. Universities offer bachelor's degrees (3-year programmes), master's degrees (2-year programmes) and third-cycle postgraduate degrees (3-year programmes); by contrast, the UASs provide bachelor’s degrees (lasting 3-4 years) and master's degrees (lasting 1.5-3 years). In line with the aims of the Bologna process – reflecting the political will of 49 countries to build a European Higher Education Area – a competence-based approach to curriculum development has gained ground in Finnish higher education (Gaebel et al., 2018[5]).

Over the past two decades, there has been active debate on the skills and competences that higher education should promote. In Europe this discussion has touched on ways of improving the quality and transparency of higher education (e.g. Ursin (2014[6])). The European Union has, for its part, emphasised that changes in working life require a broad range of competences from higher education graduates. Taking an intergovernmental perspective, the OECD has stressed the importance of learning outcomes in higher education via a feasibility study on the Assessment of Higher Education Learning Outcomes (AHELO), conducted in 2010-2013 (Tremblay, Lalancette and Roseveare, 2012[7]). In line with this, generic skills, such as problem solving, communication and co-operation (in addition to professional skills), are considered to be important in working life, continuous learning and digitalisation (e.g. European Commission, 2013[8]). To take an example, in the era of artificial intelligence, generic skills can be seen as the element that sets humans apart from machines, with technical operations being increasingly left to machines, and with humans focusing on processes that require creativity and originality (European Commission Education and Training, 2019[9]).

In Finland, the government views the competences produced by higher education as crucial for the success of Finland in global education and the labour markets. The government has set as an ideal the objective that Finland should be the most competent country in the world, with higher education producing the best learning and learning environments in the world. Furthermore, the skills generated by higher education are seen as pivotal in Finland in terms of responding to changes in the labour market, and from the point of view of continuous learning. Indeed, for a small country like Finland, higher education is seen as the key to operating and influencing on a global scale (Ministry of Education and Culture, 2017[10]).

Traditionally, higher education policy in Finland has aspired to an equal level of quality across the system (Välimaa, 2004[11]). However, assessments of the quality of teaching and learning in higher education are by no means straightforward. In fact, Finland has up to now lacked information on what students learn during their studies, and on how well the higher education policy – intended to promote equality – actually produces or enhances high quality teaching and learning.

The expertise of undergraduate students can be seen as composed of (1) field-specific knowledge and skills and (2) generic skills. In higher education, the most crucial generic skills are higher-order cognitive skills such as critical thinking, argumentation and analytical reasoning (e.g. Tsaparlis (2020[12]), Arum and Roksa (2011[13]), Lemons and Lemons (2013[14])). Nonetheless, previous studies, both internationally and in Finland, have indicated that undergraduate students face challenges, for example, in argumentation, interpreting and evaluating information, and drawing conclusions (e.g. Badcock, Pattison and Harris (2010[15]), Arum and Roksa (2011[13]), Evens, Verburgh and Elen (2013[16]), Hyytinen et al. (2015[17])).

The concerns and political aspirations presented above paved the way for the project entitled Assessment of Undergraduate Students’ Learning Outcomes in Finland (Finnish acronym KAPPAS!). Its main aims were (1) to identify the level of Finnish undergraduate students’ generic skills, and (2) to determine the factors associated with the level of generic skills. The project was funded by the Ministry of Education and Culture (MoEC) and carried out by the Finnish Institute for Educational Research (FIER) at the University of Jyväskylä, together with the Centre for University Teaching and Learning (HYPE) at the University of Helsinki. The Council for Aid to Education (CAE) participated in the project as an international partner. Altogether, seven (out of 24) Finnish UASs and 11 (out of 14) universities participated in the study.

The test instruments, testing platforms, proctor interfaces and test manuals were translated into the two official languages of Finland, i.e. Finnish and (Finnish) Swedish. The translation and adaptation process of the test instruments followed the International Test Commission (ITC) Guidelines for Translating and Adapting Tests (Bartram et al., 2017[18]) and consisted of four phases. In the first phase, the international partners translated the test instruments from English into Finnish and Swedish. Next, two translators (who had knowledge of English-speaking cultures, but whose native language was the primary language of the target culture) in Finland independently checked and confirmed the translations. Thereafter, the national research team reconciled and verified the revisions. After the Finnish translations were checked, the Swedish translations were verified against the Finnish translations, in order to ensure the equivalence of the translations. Finally, the translations were pre-tested in cognitive labs. Each cognitive lab lasted around two hours, and involved think-aloud protocols and interviews with 20 Finnish undergraduate students. The cognitive labs made it possible to check that the translation and adaptation process for both the Finnish and Swedish versions had not altered the meaning or the difficulty of the tasks, and that the instruments tapped into the cognitive processes as expected (Hyytinen et al., 2021[19]; Leighton, 2017[20]). The test instruments and testing platforms were fine-tuned on the basis of the cognitive labs, and a few consistency issues between the Finnish and Swedish translations were addressed.

The target group of the KAPPAS! project consisted of students at the initial and final stages of their undergraduate degree programmes, attending 18 participating Finnish HEIs. The aim of the sampling was to obtain as representative a sample as possible, based on the field of study, and with coverage of the entire country. The starting point of the sampling frame was to select 200 initial- and 200 final-stage students from each participating HEI. Previous experience – including the data collection from the AHELO feasibility study (Tremblay, Lalancette and Roseveare, 2012[7]; Ursin, 2020[21]) – had shown that it is difficult to motivate higher education students to participate in such studies, and that random sampling of individual students yields a poor participation rate. For this reason, the research team decided to sample students using a cluster sampling method. Within each HEI, fields of study were first randomly selected. Thereafter, clusters (such as tutor or seminar groups) within the selected fields of study were randomly drawn until the desired number of students was reached. Overall, the sampling aimed for the best possible representativeness at the national level rather than at the institution level.
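To make this two-stage procedure concrete, the minimal sketch below (in Python, with entirely invented field names, cluster names and group sizes – not the project's actual sampling code) orders the fields of study at random within one hypothetical HEI and then draws whole clusters within them until the target of roughly 200 students is reached.

```python
# Minimal sketch of two-stage cluster sampling within one HEI.
# Field and group names and sizes are hypothetical.
import random

random.seed(2019)

# Hypothetical sampling frame: fields of study mapped to student clusters
# (e.g. tutor or seminar groups) and their sizes.
frame = {
    "engineering": {"group_A": 28, "group_B": 31, "group_C": 25},
    "humanities":  {"group_D": 22, "group_E": 30},
    "business":    {"group_F": 35, "group_G": 27},
}

TARGET_PER_HEI = 200  # e.g. 200 initial-stage students per institution


def sample_clusters(frame, target):
    """Randomly order fields, then draw whole clusters within them until
    the cumulative number of students reaches the target."""
    sampled, total = [], 0
    for field in random.sample(list(frame), k=len(frame)):
        groups = list(frame[field].items())
        random.shuffle(groups)
        for group, size in groups:
            if total >= target:
                return sampled
            sampled.append((field, group, size))
            total += size
    return sampled  # frame exhausted before the target was reached


for field, group, size in sample_clusters(frame, TARGET_PER_HEI):
    print(f"{field:12s} {group}: {size} students invited")
```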

A translated and culturally adapted version of the CLA+ International was applied. It included a performance task (PT), a set of 25 selected-response questions (SRQs), and a set of 37 background information questions. It was administered online via a secure testing platform during an assigned testing window (from August 2019 to March 2020). Each test session lasted for 2 hours 15 minutes. Students had 60 minutes to complete the PT, followed by 30 minutes for the SRQs. Thereafter, students filled in a background survey. The PT measured analysis and problem solving, writing effectiveness and writing mechanics. In order to successfully complete the PT, students needed to familiarise themselves with the materials available in an electronic document library and then write an answer to the question, which dealt with the differences in life expectancies in two cities. The SRQs measured critical reading and evaluation, scientific and quantitative reasoning, and critiquing an argument. The SRQs in each section were based on one or more documents. The materials and questions in the sections covered the topics of brain protein, nanotechnology and women in combat.

Each HEI was responsible for (1) inviting students, and (2) administering and proctoring the computer-based tests according to the instructions and manuals provided by the national research team and CAE. The research team also offered training for the contact persons in all the HEIs and support for the proctors. Participation in the KAPPAS! project was voluntary, and informed consent was obtained from the participants. In some institutions, the participating students received small non-monetary incentives such as movie tickets.

Altogether, 2 402 undergraduate students participated in the study. Of these, 1 538 (64%) were initial-stage (first-year) students and 864 (36%) final-stage (third-year) students. The participants consisted of 1 273 (53%) university students and 1 129 (47%) UAS students, comprising in total 1 178 (49%) males and 1 158 (48%) females. The majority of the students took the test in Finnish, with only 156 (6.5%) completing the test in Swedish. The participation rate was 25%. The rate varied between initial-stage and final-stage students, types of HEIs and fields of study. The participation rate was highest among the initial-stage UAS students (39%), and lowest among the final-stage university students (15%).

In order to prepare the data for analysis, each PT response had to be scored by two independent and trained scorers on the basis of the CLA+ scoring rubric. The scoring rubric included three sub-scores relating to analysis and problem solving, writing effectiveness and writing mechanics. Each aspect was scored on a six-point scale with an additional option for responses that could not be scored; for instance, an empty response. In order to secure consistency of scoring, calibration papers were used, and scoring was monitored by a lead scorer.
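The chapter does not specify how scoring consistency was quantified, but the following minimal sketch illustrates the kind of agreement check a lead scorer might run on paired sub-scores; the scores, the treatment of unscorable responses and the "within one point" threshold are purely hypothetical.

```python
# Hypothetical agreement check between two independent scorers on a
# six-point rubric sub-score; 0 stands for an unscorable (e.g. empty) response.
scorer_1 = [4, 3, 5, 2, 0, 4, 3, 6, 2, 5]   # invented sub-scores
scorer_2 = [4, 4, 5, 3, 0, 4, 2, 5, 2, 5]   # for the same ten responses

# Keep only pairs where both scorers could score the response.
pairs = [(a, b) for a, b in zip(scorer_1, scorer_2) if a > 0 and b > 0]

exact = sum(a == b for a, b in pairs) / len(pairs)
adjacent = sum(abs(a - b) <= 1 for a, b in pairs) / len(pairs)

print(f"Exact agreement:      {exact:.0%}")
print(f"Within one point:     {adjacent:.0%}")
# Low agreement would prompt recalibration or rescoring by the lead scorer.
```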

The approach to the statistical analyses (descriptive statistics, and linear and logistic regression) was design-based, utilising survey weights, and accounting for clustered data. Partly due to the sampling design, and partly due to non-response, there was considerable variation in the inclusion probabilities between student sub-groups (as defined by gender, field of study, institution and study programme). The distortions in the eventual sample data were corrected by using survey weights derived from the Finnish student registers. Because the responses of individuals within a given study programme tended to be correlated, all variance estimates and the resulting confidence intervals and significance tests were computed via methods taking this intra-cluster correlation into account. Furthermore, item analysis was used to establish a comparable level of difficulty across all the tasks. The midpoint of the PT scale was approximately 990 points, while its range was approximately 510-1 470 points. The midpoint of the scale for the SRQs was approximately 1 090 points and the range around 550-1 630 points. The midpoint of the student's total score scale was 1 040 points and the range 530-1 550 points.
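As a rough illustration of this design-based approach (not the project's actual code), the sketch below fits a survey-weighted regression with cluster-robust standard errors using statsmodels; all variable names, scores, weights and study-programme identifiers are invented.

```python
# Sketch of a design-based analysis: weighted least squares with standard
# errors that allow for intra-cluster correlation within study programmes.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "total_score": [980, 1120, 1050, 1210, 940, 1090, 1170, 1010],
    "final_stage": [0, 1, 0, 1, 0, 1, 1, 0],          # stage of studies
    "university":  [0, 0, 1, 1, 0, 1, 1, 0],          # university vs. UAS
    "weight":      [1.4, 0.9, 1.1, 0.8, 1.6, 1.0, 0.7, 1.2],  # survey weights
    "programme":   [1, 1, 2, 2, 3, 3, 4, 4],          # study-programme cluster
})

model = smf.wls("total_score ~ final_stage + university",
                data=df, weights=df["weight"])
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["programme"]})
print(result.summary())
```

In practice, the weights would come from the national student registers and dedicated survey-analysis tooling could be used; the sketch only shows the combination of weighting and cluster-aware variance estimation.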

All students who participated in the study received a report on their test scores plus support material, which allowed them to enhance their generic skills. Each HEI also received a report on their students’ test performance. The national research team organised several tailored webinars for HEIs to discuss the KAPPAS! results and consider how generic skills could be better integrated into teaching and learning practices in a given HEI.

This section examines Finnish higher education students’ mastery of generic skills in terms of mean scores and mastery levels, first for the dataset as a whole and then by stage of studies and higher education sector. The CLA+ mean score for the data as a whole was 1 075. The scores in the PT and in the SRQ section were both very close to this figure (Figure 12.1). Nonetheless, there were some variations between student groups. The differences between initial- and final-stage students in total scores and in PT scores were statistically significant; in other words, final-stage students’ generic skills were at a higher level than those of initial-stage students. On examining the mean scores of the university and UAS students, it was found that the final-stage university students achieved the highest scores. Their difference from any other student group was especially noticeable in the PT segment, in which the final-stage university students scored 45 points higher than the initial-stage university students and as much as 96 points higher than the final-stage UAS students. All these differences were statistically highly significant. Depending on the score under consideration (PT, SRQ or total), the higher education sector explained 5-9% of the variance in the mean scores while the stage of studies explained only around 2% of the variance.
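The report does not spell out the estimator behind these "variance explained" figures; one common reading is the coefficient of determination (R²) of a model containing the grouping factor alone, as in the following sketch with invented scores and an assumed sector variable.

```python
# Sketch of one way to obtain the share of score variance explained by a
# single grouping factor (here the HE sector). Data are invented.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "pt_score": [1180, 1090, 1230, 1010, 960, 1020, 930, 1100],
    "sector":   ["university"] * 4 + ["uas"] * 4,
})

r2 = smf.ols("pt_score ~ C(sector)", data=df).fit().rsquared
print(f"Share of PT-score variance explained by sector: {r2:.1%}")
```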

In terms of mastery levels, for almost 60% of the Finnish undergraduate students the generic skills were at a Basic or lower level. For the remainder (about 40%) the skills were at a Proficient or higher level (Table 12.1). Very few students reached the highest (Advanced) mastery level. There was a clear difference between the higher education sectors, with 24% of the UAS students exhibiting the lowest mastery level (Below Basic), whereas among the university students only 7% fell into this category. At the same time, 23% of the university students reached the two highest mastery levels, with only 5% of the UAS students achieving these levels. The difference was even more striking when the stage of studies was taken into account. Thus, 29% of the initial-stage UAS students fell below a Basic level of mastery, with only 8% of initial-stage university students falling into this category. Among final-stage students, 28% of the university students reached at least an Accomplished level of mastery, while the corresponding figure for the final-stage UAS students was 7%.

The associations with the level of generic skills were investigated with respect to the field of study, age, gender, educational background, socio-economic background and attitude towards the test. As age showed no systematic association with the mastery of generic skills, it will not be further discussed in this chapter.

Gender showed a systematic association with the CLA+ mean scores, although the direction of the association depended on the score component (Figure 12.2). In the PT component the female students scored significantly higher than the males. In the SRQs the result was the opposite, with male students outperforming females. Although the field of study per se had no systematic association with the level of generic skills, an association was observed via gender: the best overall PT performance occurred in the female-dominated fields (such as the humanities), while the best SRQ performance occurred in the male-dominated fields (such as engineering). In other words, in the female-dominated fields the skills involving analysis and problem solving, writing effectiveness and writing mechanics were at a higher level than in the male-dominated fields, whereas the male-dominated fields outperformed the female-dominated fields in the skills measured by the SRQs, i.e. critical reading and evaluation, scientific and quantitative reasoning, and the ability to critique an argument. However, gender explained only around 1% of the variance in the mean scores for the fields analysed.

The information on educational background consisted of whether or not the participant had taken the Matriculation Examination, how the participant had succeeded in the mother tongue test included with it and whether the participant had a previous degree or qualification. The Finnish Matriculation Examination is a national examination generally taken at the end of the Finnish upper secondary school. The examination consists of a minimum of four tests. One of these, i.e. the test in the candidate’s mother tongue, is compulsory for all candidates. The grades in the Matriculation Examination are (from highest to lowest): laudatur (L), eximia cum laude approbatur (E), magna cum laude approbatur (M), cum laude approbatur (C), lubenter approbatur (B), approbatur (A) and improbatur (I, indicating failure in the test).

The proportion of students who had completed the Matriculation Examination in the dataset was 80%. The figure was higher for the university students, with 92% of the university students (as opposed to 66% of the UAS students) having completed the Matriculation Examination. The CLA+ mean scores of students who had completed the Matriculation Examination were on average 84 points higher than for those students who had not done so (Figure 12.3). This difference was statistically highly significant. Indeed, some of the differences between the university and the UAS students can be explained by the larger proportion of persons who had completed the Matriculation Examination among the university students. The Matriculation Examination explained 5-9% of the variance of the mean scores.

The level of generic skills was strongly associated with the mother tongue skills exhibited in the Matriculation Examination; thus the CLA+ mean scores rose almost linearly with the mother tongue grades, and the differences between the groups were statistically significant (Figure 12.4). Those who had failed the mother tongue test or had not taken the Matriculation Examination showed the lowest level of generic skills, whereas those with either of the two highest grades in the mother tongue test achieved the highest results. The mother tongue grade in the Matriculation Examination was in fact the strongest individual factor explaining the variance in the mean scores (i.e., 11-22% of the variance).

There was a clear difference between university and UAS students in the previous qualifications they had obtained; thus, 76% of the university students and 45% of the UAS students had completed only the Matriculation Examination, whereas for 27% of the UAS students, as opposed to just 2% of the university students, a vocational qualification was the only qualification obtained. Those who had completed both a vocational qualification and the Matriculation Examination accounted for 16% of the UAS students and 7% of the university students. Among the UAS students, 7% already had another higher education degree; the corresponding proportion among the university students was 10%.

The students who had already completed a previous higher education degree achieved the highest scores in the CLA+ test; nevertheless, the difference between these students and those who had attained only the Matriculation Examination was not statistically significant (Figure 12.5). Students with only a vocational qualification had the lowest scores.

Parental education and the estimated number of books in the student's childhood home were taken to describe the student's socio-economic background. From the data on parental education one can observe that university students’ parents are more likely to have a high level of education than the parents of UAS students. In the present dataset, 43% of the university students’ parents had at least a master's degree whereas the corresponding proportion for the UAS students was 20%. Conversely, 47% of the UAS students’ parents had not gone beyond a secondary level qualification. The corresponding figure among the university students’ parents was 27%.

Students whose parents had attained, at most, a basic education stood out most clearly from the other groups of students, with mean scores significantly lower than those of the other groups (Figure 12.6). The differences between the other groups were fairly small, with only a few differences in PT or total mean scores showing statistical significance. It was notable that, overall, the parental education level explained only a small part of the variance (1-4%) in the level of generic skills.

The number of books in the student's childhood home can be used as an indicator of the reading and learning culture associated with the student’s home background. The data indicated that university students had on average a higher number of books in their childhood homes than the UAS students. The mean scores in the CLA+ test (in the PT, the SRQ and in total) improved linearly with the increasing number of books in the student’s childhood home (Figure 12.7). This positive connection was statistically highly significant for the total scores, the PT scores and the SRQ scores. The number of books in the childhood home explained 5-8% of the variance in the mean scores.

The students’ attitudes towards the test were explored by asking how engaging they found the tasks included in the test, and how much effort they put into completing the tasks. The distribution of student interest was fairly symmetrical in the data; the majority of students found the test engaging or moderately engaging. University students were more likely to find the test engaging than UAS students. There were no significant differences between initial-stage and final-stage university or UAS students.

Some four out of five students said they had made a lot of effort or applied their best effort in completing the CLA+ International test. One out of three university students reported that they had applied their best effort in the test. Such a major effort was more common among final-stage than initial-stage students, both in the universities and the UASs. Final-stage university students were the most likely to apply their best effort (39%) whereas initial-stage UAS students were the least likely to do so (11%). The proportion of students who said they made little or no effort was only 2% in the data.

Both engagement and effort in the test had a linear and statistically highly significant association with the test results: the more effort a student had applied in completing the test and the more engaged a student found the test, the higher were the results achieved (Figure 12.8 and Figure 12.9). Engagement in the test explained 4-8% of the variance, while effort explained 4-9% of the variance.

Multiple regression analyses were conducted to examine the relations between the CLA+ scores and background variables. For both university and UAS students, the most significant explanatory variables for all the CLA+ scores were (1) the student’s mother tongue grade in the Matriculation Examination, and (2) the amount of effort the student had applied in taking the test (Table 12.2). The number of books in the childhood home was a statistically significant predictor of the CLA+ scores in all but one case (the PT for university students), indicating that students who grew up with books at home tended to have a higher level of generic skills. The roles of the other background variables tested varied between the scores. In particular, differences between the fields of study and gender differences often lost their statistical significance when the other background variables were controlled for. The coefficients of determination of the regression models were higher for the PT than for the SRQs, and higher for UAS students than for university students. It was notable that the variation in the UAS students’ PT scores could be explained fairly precisely by the background factors.
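A minimal sketch of this kind of multiple regression is given below, with simulated data and assumed variable names (mother tongue grade, test effort, books in the childhood home); in the actual analyses the survey weights and clustering described earlier would also be applied. Comparing the R² of the PT and SRQ models mirrors the comparison of coefficients of determination reported above.

```python
# Sketch of a multiple regression of CLA+ scores on background variables,
# using simulated data. Variable names and coefficients are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 200
df = pd.DataFrame({
    "mother_tongue_grade": rng.integers(1, 8, n),  # 1 = I ... 7 = L
    "effort":              rng.integers(1, 6, n),  # self-reported effort
    "books_at_home":       rng.integers(1, 7, n),  # banded number of books
})
df["pt_score"] = (800 + 40 * df["mother_tongue_grade"] + 25 * df["effort"]
                  + 15 * df["books_at_home"] + rng.normal(0, 80, n))
df["srq_score"] = (850 + 30 * df["mother_tongue_grade"] + 20 * df["effort"]
                   + 10 * df["books_at_home"] + rng.normal(0, 120, n))

for outcome in ("pt_score", "srq_score"):
    fit = smf.ols(f"{outcome} ~ mother_tongue_grade + effort + books_at_home",
                  data=df).fit()
    print(outcome, "R^2 =", round(fit.rsquared, 2))
```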

On the basis of the study, four policy recommendations can be made. Firstly, more attention should be paid to the learning of generic skills at the lower educational levels, and also in learning environments outside the school. While the KAPPAS! project focused on undergraduate students’ generic skills, the findings show that an important foundation for the development of generic skills is laid in prior education. This finding is in line with previous studies, where it has been found that generic skills are an important predictor of academic achievement and adaptation to higher education (e.g. Arum and Roksa (2011[13]), van der Zanden et al. (2018[23])). Thus, the results indicate a need to emphasise generic skills already at pre-tertiary level (e.g. at secondary level), especially in vocational education and training. Furthermore, the results demonstrate that the scholarly culture of the childhood home is an important predictor of students’ generic skills. In this sense, the evidence from this study highlights the importance of reading, and of encouraging reading from a very young age (cf. Leino et al. (2019[24]), Kleemola, Hyytinen and Toom (forthcoming[25])). This could involve paying attention to more than just the scholarly culture of the childhood home, and considering how a range of supportive learning environments outside the school might be put in place.

Secondly, the role of generic skills in student admission should be explored. In 2020, the admissions procedures in Finnish higher education were reformed; thus, after some political debate, the emphasis in student admission moved away from one-off “high stakes” entrance examinations towards diploma-based admissions involving a focus on National Matriculation Examination grades (Kleemola and Hyytinen, 2019[26]; Kleemola, Hyytinen and Toom, forthcoming[25]). The findings of this project support diploma-based admissions in student selection, insofar as the Matriculation Examination mother tongue grades in particular were a good predictor of the mastery of generic skills. Nevertheless, there are good reasons not to give up entrance examinations entirely. In line with the fundamental principle of equal opportunities in Finnish HE, a transition to higher education must be secured for those eligible applicants who have not completed the Matriculation Examination. Entrance examinations of a more generic nature – involving something similar to the CLA+ International – have been proposed as a solution (Talman, 2018[27]). However, before undertaking any comprehensive renewal of the entrance examination, more research on the predictive value of prior generic skills will be needed. To gain more insights into this issue, future research should, for example, focus on testing a wider range of the generic skills that may be relevant in preparedness for higher education (Kleemola, Hyytinen and Toom, forthcoming[25]).

Thirdly, generic skills need to be developed in line with the aims of UAS and university education. The findings of the present study emphasised the differences between university and UAS students’ mastery of generic skills, with university students exhibiting stronger and more versatile generic skills than UAS students, as measured by CLA+ International. From the HE policy perspective, this observation can be accounted for by the differing missions and student profiles of the two HE sectors. In efforts to develop generic skills, it will be important (1) to recognise that the students in UASs and universities display different kinds of critical thinking, argumentation, analytical reasoning and written communication skills, and (2) to consider the consequences of this in terms of ways of supporting students’ learning throughout their study path. The skills in question are considered crucial for becoming a genuinely autonomous and participating citizen of the 21st century. Moreover, these skills have been found to be essential for progressing successfully through higher education, and in the transition to working life (Arum and Roksa, 2011[13]; Tuononen, Parpala and Lindblom-Ylänne, 2017[28]; Tuononen, Parpala and Lindblom-Ylänne, 2019[29]).

Finally, the findings indicate that the level of generic skills is surprisingly low, especially for certain groups of students in a country that strives for equality across its HE system. The results in this regard foreground the teaching and learning practices of Finnish HEIs as a matter for debate. There are several means by which HEIs could better promote the teaching and learning of generic skills. For example, in seeking to support the development of generic skills, more attention should be paid to the coherence of the curriculum and to the systematic integration of generic skills throughout students’ studies (Hyytinen, Toom and Shavelson, 2019[30]). Generic skills need to be systematically practised in multiple contexts, and within various tasks, combining theory and practice throughout students’ higher education studies (Virtanen and Tynjälä, 2018[31]). This means that the learning of generic skills should be expressed within the curriculum and be systematically taken into account in terms of intended learning outcomes, teaching methods, assignments and assessments (Hyytinen, Toom and Shavelson, 2019[30]). Successful integration at the curriculum level involves collaboration between teachers and persons who make decisions on the curriculum. Moreover, higher education teachers need to have a clear understanding of what generic skills actually consist of, and why the skills should be taught. They further need pedagogical competencies that will enable them to integrate the elements of generic skills within their teaching practices.

What, then, were the main lessons from the project? When the participating HEIs in the KAPPAS! project were asked about the usability of the assessment results, many HEIs indicated their intention to use the findings to improve their teaching and learning. The HEIs also found that the project made generic skills more visible in their institutions, thus sparking discussion on the role of generic skills in teaching, and paving the way for the development of more working life-oriented curricula in their study programmes. Hopes were also expressed that the institution-specific findings would be used as part of the teachers’ pedagogical training, and ideally also in student guidance, encompassing personal study plans and career guidance. Nonetheless, for the HEIs, the most challenging aspects of the assessment were bound up with the labour-intensive implementation of the test (including the time-consuming student recruitment) and the limitations of the institution-specific findings in cases where the number of participating students remained low. Furthermore, given that in the KAPPAS! project only a handful of the participating students were interviewed on how they could utilise their individual test results, it would be important in future to carry out such an enquiry on a larger scale; for example, as part of the actual test.

Although the project team aimed to maintain a high scientific standard in conducting the study, some unavoidable challenges emerged. The difficulties in student recruitment meant that the participation rate remained fairly low although this could to some extent be taken into account in the analyses. Furthermore, there were some reliability issues related to the test instrument, in terms of the limited number of tasks employed. A further point to bear in mind was that the cross-sectional study design did not allow a reliable investigation of the ways in which generic skills actually develop – an issue that is of crucial importance and interest to Finnish HEIs and the Finnish government.

In order to examine the development of generic skills, there is a need for a longitudinal inquiry, within which the same students would be followed from the initial to the final stage of their undergraduate studies. For the HE system as a whole, such follow-up information could indicate whether Finnish higher education is indeed on the way to producing the best learning in the world, as set out by the government in its policy goals. For the HEIs and their students, a follow-up study would give more reliable information on the added value of the education provided. In the future, it will also be important to better acknowledge students as cognizant individuals, and to ensure that the assessment actually helps the student to become a better learner. It is therefore important that the assessment of generic skills should provide information that truly supports students in developing their generic skills. Last but not least, given that in Finland standardised testing in education has not been widely adopted, it is crucial that the assessment does not become a “high stakes” once-and-for-all exercise; rather, it should serve the purpose of enhancement-led assessment – a principle that has up to now been paramount in the assessment culture of Finnish higher education. One can anticipate that continued adherence to this principle will promote the assessment of generic skills in Finnish higher education in the years to come.

References

[1] Ammattikorkeakoululaki (2014), Polytechnic Act (932/2014), https://www.finlex.fi/fi/laki/smur/2014/20140932 (accessed on 23 April 2021).

[13] Arum, R. and J. Roksa (2011), Academically Adrift: Limited Learning on College Campuses, University of Chicago Press, Chicago.

[15] Badcock, P., P. Pattison and K. Harris (2010), “Developing generic skills through university study: a study of arts, science and engineering in Australia”, Higher Education, Vol. 60/4, pp. 441-458, https://doi.org/10.1007/s10734-010-9308-8.

[18] Bartram, D. et al. (2017), “ITC Guidelines for Translating and Adapting Tests (Second Edition)”, International Journal of Testing, Vol. 18/2, pp. 101-134, https://doi.org/10.1080/15305058.2017.1398166.

[14] DeHaan, R. (ed.) (2013), “Questions for Assessing Higher-Order Cognitive Skills: It’s Not Just Bloom’s”, CBE—Life Sciences Education, Vol. 12/1, pp. 47-58, https://doi.org/10.1187/cbe.12-03-0024.

[8] European Commission (2013), High Level Group on the Modernisation of Higher Education.

[9] European Commission Education and Training (2019), Key competences for lifelong learning, Publications Office of the European Union.

[16] Evens, M., A. Verburgh and J. Elen (2013), “Critical Thinking in College Freshmen: The Impact of Secondary and Higher Education”, International Journal of Higher Education, Vol. 2/3, https://doi.org/10.5430/ijhe.v2n3p139.

[5] Gaebel, M. et al. (2018), Trends 2018: Learning and teaching in the European Higher Education Area.

[17] Hyytinen, H. et al. (2015), “Problematising the equivalence of the test results of performance-based critical thinking tests for undergraduate students”, Studies in Educational Evaluation, Vol. 44, pp. 1-8, https://doi.org/10.1016/j.stueduc.2014.11.001.

[30] Hyytinen, H., A. Toom and R. Shavelson (2019), “Enhancing Scientific Thinking Through the Development of Critical Thinking in Higher Education”, in Redefining Scientific Thinking for Higher Education, Springer International Publishing, Cham, https://doi.org/10.1007/978-3-030-24215-2_3.

[19] Hyytinen, H. et al. (2021), “The dynamic relationship between response processes and self-regulation in critical thinking assessments”, Studies in Educational Evaluation, Vol. 71, p. 101090, https://doi.org/10.1016/j.stueduc.2021.101090.

[26] Kleemola, K. and H. Hyytinen (2019), “Exploring the Relationship between Law Students’ Prior Performance and Academic Achievement at University”, Education Sciences, Vol. 9/3, p. 236, https://doi.org/10.3390/educsci9030236.

[25] Kleemola, K., H. Hyytinen and A. Toom (forthcoming), Critical thinking and writing in transition to higher education in Finland: does prior academic performance and socioeconomic background matter?, Submitted for publication.

[28] Kyndt, E. et al. (eds.) (2017), The transition from university to working life - An exploration of graduates’ perceptions of their academic competences, Taylor & Francis, Routledge, http://hdl.handle.net/10138/308628.

[20] Leighton, J. (2017), Using Think-Aloud Interviews and Cognitive Labs in Educational Research, Oxford University Press, Oxford, https://doi.org/10.1093/acprof:oso/9780199372904.001.0001.

[24] Leino, K. et al. (2019), PISA 18: ensituloksia. Suomi parhaiden joukossa [PISA 2018: First results. Finland among the best], Ministry of Education and Culture, ISBN: 978-952-263-678-2, https://julkaisut.valtioneuvosto.fi/handle/10024/161919 (accessed on 1 August 2022).

[10] Ministry of Education and Culture (2017), Working together for the world’s best education. Policies on promoting internationality, Ministry of Education and Culture.

[27] Talman, K. (2018), Ammattikorkeakoulujen uuden digitaalisen valintakokeen kehittäminen – määrittelyvaiheen tulokset: Tutkimusraportti [Development of a New Digital Entrance Examination for Universities of Applied Sciences – Results of the Definition Phase: Research Report], Metropolia University of Applied Sciences, ISBN: 978-952-328-119-6, https://www.theseus.fi/handle/10024/154646 (accessed on 1 August 2022).

[7] Tremblay, K., D. Lalancette and D. Roseveare (2012), “Assessment of Higher Education Learning Outcomes (AHELO) Feasibility Study”, Feasibility study report, Vol. 1, https://www.oecd.org/education/skills-beyond-school/AHELOFSReportVolume1.pdf (accessed on 1 August 2022).

[12] Tsaparlis, G. (2020), “Higher and lower-order thinking skills: The case of chemistry revisited”, Journal of Baltic Science Education, Vol. 19/3, pp. 467-483, https://doi.org/10.33225/jbse/20.19.467.

[29] Tuononen, T., A. Parpala and S. Lindblom-Ylänne (2019), “Graduates’ Evaluations of Usefulness of University Education, and Early Career Success – a Longitudinal Study of the Transition to Working Life”, Assessment & Evaluation in Higher Education.

[2] Universities Act (2009), FINLEX, The Ministry of Justice, http://www.finlex.fi/en/laki/kaannokset/2009/en20090558 (accessed on 23 April 2021).

[21] Ursin, J. (2020), Assessment of Higher Education Learning Outcomes Feasibility Study, SAGE Publications, Thousand Oaks, CA, https://doi.org/10.4135/9781529714395.n52.

[6] Ursin, J. (2014), “Learning outcomes in Finnish higher education from the perspective of comprehensive curriculum framework”, in Coates, H. (ed.), Higher Education Learning Outcomes Assessment, Peter Lang, https://doi.org/10.3726/978-3-653-04632-8/20.

[22] Ursin, J. et al. (eds.) (2021), Assessment of undergraduate students’ generic skills in Finland: Findings of the Kappas! Project (Report No. 2021:31), Finnish Ministry of Education and Culture, http://urn.fi.

[3] Ursin, J., H. Hyytinen and K. Silvennoinen (2019), “Higher Education Reforms in Finland”, in Broucker, B. et al. (eds.), Higher Education System Reform, BRILL, https://doi.org/10.1163/9789004400115_005.

[11] Välimaa, J. (2004), “Nationalisation, Localisation and Globalisation in Finnish Higher Education”, Higher Education, Vol. 48/1, pp. 27-54, https://doi.org/10.1023/b:high.0000033769.69765.4a.

[23] van der Zanden, P. et al. (2018), “Patterns of success: first-year student success in multiple domains”, Studies in Higher Education, Vol. 44/11, pp. 2081-2095, https://doi.org/10.1080/03075079.2018.1493097.

[4] Vipunen – opetushallinnon tilastopalvelu (2019), Korkeakoulujen opiskelijat. Opetushallinnon ja Tilastokeskuksen tietopalvelusopimuksen aineisto 2.8, https://vipunen.fi/fi-fi/_layouts/15/xlviewer.aspx?id=/fi-fi/Raportit/Korkeakoulutuksen%20opiskelijat_A1.xlsb.

[31] Virtanen, A. and P. Tynjälä (2018), “Factors explaining the learning of generic skills: a study of university students’ experiences”, Teaching in Higher Education, Vol. 24/7, pp. 880-894, https://doi.org/10.1080/13562517.2018.1515195 (accessed on 23 April 2021).
