5. Skills and attitudes for new information landscapes

The Internet and social media have dramatically increased the amount of information that is accessible and exchanged around the world. However, quantitative increases in the amount of information available have not been accompanied by increases in the quality of such information. The information landscape is complex and difficult to navigate. Modern information systems create opportunities as they allow instantaneous access to high-quality information once only available to the few (for example, in dedicated libraries). At the same time, new information systems create new threats as they rely on technologies that allow easy and rapid access to false information or true information taken out of context. The fact that it is easy to produce and share information can also lead to information overload.

The decline of traditional news media, such as newspapers and magazines, the rise of social media as a medium for disseminating information, the use of new generative AI models, such as ChatGPT, and the spread of deep-fake technologies all contribute to creating an information landscape that is larger, more diverse and more complicated for individuals to filter and navigate. A study by the European Union (EU) found that deep-fake technology can have serious “malicious, deceitful and even destructive potential at an individual, organisational and societal level”, either directly by spreading false information or indirectly by eroding trust in news and information on line (Huijstee et al., 2022[1]). In fact, the mere existence of deep fakes can increase distrust in information, regardless of whether it is true or false (Ternovski, Kalla and Aronow, 2021[2]), and individuals may struggle to trust any evidence because they worry it might be a falsification (Chesney and Citron, 2018[3]). The erosion of trust in information can be observed in the results of the 2023 Edelman Trust Barometer, in which more respondents answered that their government is “a source of false or misleading information” (46%) than answered that it is “a reliable source of trustworthy information” (39%) (Edelman, 2023[4]). Similarly, in the 2022 Edelman Trust Barometer report, 67% of respondents worried that journalists purposefully mislead by writing false, misleading or exaggerated information, and 76% of respondents worried about the potential use of false information as a weapon (Edelman, 2022[5]). Citizen trust in the information available to them may only worsen as deep fakes and generative AI become more widely adopted and better able to mimic human communication and behaviours.

A large number of adults in OECD countries worry about receiving false information on line (59% on average) or being the subject of online fraud (56% on average) (Figure 5.1). The share of adults worrying about receiving false information was highest in Italy (78%), the Republic of Türkiye (hereafter “Türkiye”) (75%) and France (74%) and lowest in Poland (36%), Latvia (35%) and Lithuania (30%). Worries about fraud, such as having bank account information stolen, were highest among adults in Portugal (78%), Türkiye (78%) and France (75%).

The emergence of generative AI systems has the potential to increase even further the complexity of the information landscape, shaping how easily individuals can use and exchange information and how businesses and societies build economic models based on information and data exchange. Generative AI models are statistical algorithms that create new content in response to prompts. In the case of written text, generative AI models can mimic humans by predicting the most likely sequence of words given a specific stimulus (prompt) and the corpus of content used to train them (training data). Generative AI systems can produce information content – text, video or images – instantaneously and cheaply. New generative AI systems are starting to be deployed in “content farms”, i.e. websites publishing AI-generated articles that summarise content from traditional news outlets or present false and misleading content created by the AI systems. For example, in April 2023, NewsGuard identified 49 websites in 7 languages – Chinese, Czech, English, French, Portuguese, Tagalog and Thai – that were entirely or mostly generated by generative AI systems mimicking the content available on news websites, without indicating the sources of the material being published or specifying the ownership or control of the site (Sadeghi and Arvanitis, 2023[7]).
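The next-word prediction described above can be illustrated with a deliberately tiny sketch: a bigram model that counts, in a toy training corpus, which word most often follows which. Real generative AI models rely on vastly larger corpora and neural networks rather than raw counts; the corpus and function names here are purely illustrative.

```python
from collections import Counter, defaultdict

# Toy training corpus (illustrative only).
corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# "Training": count which word follows which.
following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

def predict_next(word):
    """Return the word that most often followed `word` in the training data."""
    return following[word].most_common(1)[0][0]

# Given the prompt word "sat", the model continues with "on",
# because "on" always followed "sat" in the corpus.
print(predict_next("sat"))  # on
```

The sketch also hints at why such systems can "hallucinate": the model has no notion of truth, only of which continuations were frequent in its training data.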

Generative AI models have the potential to significantly increase the volume of information in the information landscape, which can make it challenging for individuals to distinguish between relevant and irrelevant information. Generative AI models can also be used to create and spread false information intentionally, cheaply and quickly. Furthermore, AI systems can unintentionally produce false content in response to a specific stimulus – referred to as “hallucinating” – for example, because the quality of the training data is poor, because the amount of training data is insufficient to provide an accurate output, because there are inconsistencies in the training data or because of misclassification and errors in encoding and decoding text. In both cases, generative AI systems can become powerful agents of disinformation. Moreover, because AI systems do not have an understanding of the meaning of the word sequences they produce, their output can be placed in the wrong context, thereby leading to the proliferation of misinformation.

Individuals and technologies powered by AI can lower the quality of the information landscape by:

  • deliberately propagating false information to cause harm (disinformation)

  • disseminating false information that is not intended to cause harm, often as a result of unknowing individuals sharing rumours or misleading content (misinformation)

  • sharing genuine information with the intent to cause harm, for example, leaking private information or deliberately using true information in the wrong context (malinformation) (Wardle and Derakhshan, 2017[8]).

Although technologies play an essential role in shaping the quality of the information landscape, individuals also help determine its quality. Individuals are, in fact, not only passive consumers of information – including false or misleading information – available on line; they are also active agents who shape the information landscape by creating content and/or contributing to the spread of information. Whether willingly or unwillingly, through their actions, individuals can help improve or worsen the quality of the information landscape. Social media tools have made it easier than ever for citizens to share information rapidly with large numbers of “friends” and followers, to the point where disinformation has been found to spread more rapidly and widely than true news (Pennycook and Rand, 2019[9]; Vosoughi, Roy and Aral, 2018[10]).

Some individuals may disseminate disinformation as part of an individual identity-building process (Papapicco, Lamanna and D’Errico, 2022[11]). Others may wish to share accurate information but may not have the skills needed to evaluate its quality. Still others may not verify the accuracy of information content due to lack of time or cognitive fatigue arising from information overload (Pennycook et al., 2021[12]). Moreover, individuals may overrate their ability to distinguish false news from true information and view themselves as better than others at discerning the two (Corbu et al., 2020[13]). For example, a study found that 84% of individuals in the United States felt at least somewhat confident in their ability to detect fake news (Barthel, Mitchell and Holcomb, 2016[14]). However, another study found that only 17% of participants scored better than chance when trying to discern false headlines from real ones (Moravec, Minas and Dennis, 2018[15]). Similarly, Lyons and colleagues (2021[16]) found that three in four Americans overestimate their ability to detect false headlines, ranking themselves on average 22 percentiles higher than their real ranking. This overconfidence may, therefore, lead individuals not only to base their actions and decisions on false or inadequate information but also to inadvertently contribute to a low-quality information landscape by sharing false information or failing to put information in the proper context. Behaviourally informed interventions can be an effective tool to reduce the spread of disinformation, as exemplified in Box 5.1.

The worsening of the information landscape may have a detrimental impact on individuals’ cognitive and behavioural processes. The overabundance of information of varying accuracy might obscure important facts and make it challenging for individuals to distinguish between credible and non-credible sources. This cognitive burden requires a heightened effort to critically evaluate and process information, which can be demanding on an individual’s finite attentional resources. Additionally, constant exposure to conflicting information can lead to confusion and mistrust, hindering the formation of informed opinions and decision-making abilities (Bawden and Robinson, 2020[18]).

Individuals’ exposure to false information could also pose a threat to society more broadly. In 2022, the World Health Organization conducted a systematic review of infodemics (an overabundance of both false and correct information obscuring the information landscape) and health misinformation. It found that misinformation on social media had negative repercussions, leading to a rise in inaccurate interpretations of scientific information, polarisation of opinion, an increase in fear and panic, and/or a reduction in access to healthcare (Borges do Nascimento et al., 2022[19]). It also found that during times of crisis, such as the coronavirus (COVID-19) pandemic and humanitarian emergencies, social media had increasingly spread low-quality health-related information. For example, during the pandemic, a conspiracy theory circulated that new 5G infrastructure was causing the spread of the COVID-19 virus. This conspiracy theory led to the damage and destruction of telecommunications masts in Australia, Europe and North America, and there were several cases of verbal and/or physical abuse of engineers working on the new 5G network (Ankel, 2020[20]; Cerulus, 2020[21]; Pasley, 2020[22]).

Furthermore, scholars believe that misinformation is hindering action towards bettering the environment and fighting climate change (Benegal and Scruggs, 2018[23]). Scientists argue that “fake news” increases political polarisation and makes political action towards issues such as climate change more challenging (Tucker et al., 2018[24]).

The ability to evaluate the quality of information and to seek and retrieve relevant information rests on a range of cognitive and metacognitive skills, knowledge, attitudes and dispositions. Alongside the ability to process information – for example, text comprehension and numeracy skills, which have been extensively examined in other OECD-led reports and publications (OECD, 2019[25]; 2021[26]) – operating effectively in complex information landscapes requires knowing how information is generated and the limitations inherent in different information generation processes, as well as being aware of one’s own and other people’s cognitive limitations, i.e. having metacognitive skills.

As societies enter what some scholars have labelled the “post-truth” era (d’Ancona, 2017[27]) and a time when generative AI could have an unprecedented impact on information exchange, it is crucial to consider the skills individuals may need to navigate and make the most of an increasingly complex information landscape. At the same time, it is crucial to recognise how human cognition shapes the way individuals perceive and process information, and that upskilling and reskilling efforts alone are insufficient to ensure individuals’ safety and well-being as they navigate complex information landscapes. Understanding the type of skills and competences people need to be more resistant to the increasing amount of false information circulating on line rests on an understanding of how people create, internalise and change their belief in information.

Media literacy is widely considered a key skill to help individuals assess the quality of the information they are presented with (Valverde-Berrocoso, González-Fernández and Acevedo-Borrega, 2022[28]). Individuals with higher levels of media literacy have been found to be more adept at handling misleading information in a critical manner (Jones-Jang, Mortensen and Liu, 2019[29]). The concept of “media literacy” centres around the understanding that all forms of media are created with a specific purpose, and this purpose influences how information is conveyed (Huguet et al., 2019[30]). Media literacy encompasses the skills needed to engage with media at multiple levels and includes the ability to access, analyse, evaluate and create content in various contexts (Cortesi et al., 2020[31]; Livingstone, 2003[32]; Potter, 2010[33]). In Finland, for example, media literacy is seen as a “civic competence” and is therefore embedded in the education system, and policies to promote media literacy in schools have increasingly been implemented in the United States and Japan (Box 5.2).

At the same time, it has been argued that media literacy alone is not sufficient to make people resilient to false and misleading information and that a combination of several literacies, including information literacy, digital literacy, science literacy and news literacy, is needed for individuals to reliably navigate the current information environment (Jones-Jang, Mortensen and Liu, 2019[29]). When discussing the skills necessary for effectively handling information in the 21st century, it is important to consider that these constructs overlap.

A predominant trend in the research on media literacy is the adoption of a functional approach, which emphasises the identification of specific competencies necessary for effective media literacy, such as the ability to evaluate and verify sources, search for information and critically analyse media messages (Edwards et al., 2021[53]). Among the competencies associated with media literacy, as well as with information literacy, digital literacy, science literacy and news literacy, is critical thinking. Critical thinking is frequently recognised as a core component and is commonly cited in literature reviews of the field (Chapman, 2016[54]; Potter, 2010[33]). For example, one systematic literature review found critical thinking skills crucial for identifying fake news (Machete and Turpin, 2020[55]). Definitions of critical thinking often emphasise logical or rational thinking, encompassing the ability to reason, evaluate arguments and evidence, and argue in a sound and cogent manner to arrive at a relevant and appropriate solution to a problem. While the concept of critical thinking carries multiple meanings in the literature, its essential components are widely recognised as abilities that help people analyse media messages, examine their underlying meanings and identify the motives of the sender, as well as abilities that help evaluate media messages for accuracy, credibility, completeness and usefulness. Research has also shown that suspending judgement and resisting one’s immediate intuition – a part of critical thinking relevant to dealing with misinformation – improves truth discernment.

The critical thinking process is contingent upon several key dispositions. Vardi (2015[56]) identified three dispositions involved in critical thinking: 1) self-regulation, characterised by self-discipline and self-management; 2) an open, fair and reasonable mindset, including a readiness to confront and recognise one’s own biases and to revise one’s views as necessary; and 3) a commitment to ongoing self-improvement and the acquisition of knowledge. Thomas and Lok (2015[57]) also conducted research on the dispositions and personal attitudes that support the development and application of critical thinking skills and found similar results: being open-minded and fair-minded, truth-seeking and curious, and avoiding cultural- or trait-induced bias and dichotomous black-and-white thinking.

As young people spend more of their time on line and are engaged on social media platforms, they might be increasingly exposed to false and misleading information (Twenge, Martin and Spitzberg, 2019[58]). Survey data from the United Kingdom indicates that 10% of young people aged 8-17 view false information more than 6 times per day, and over half of young people are exposed to it daily (Cawthorne, 2021[59]). Since young people are still developing executive functions, they are generally less capable of self-regulating interactive media use (Burns and Gottschalk, 2020[60]). They might, therefore, be at particular risk when exposed to misleading content.

It is worth noting that caregivers significantly influence children’s media and digital technology exposure. They are typically the ones who introduce these technologies into children’s lives and teach them how to use them. Consequently, children often imitate their caregivers’ digital technology usage patterns as they initially engage with these technologies (Terras and Ramsay, 2016[61]). Evidence suggests that children of parents with lower levels of digital literacy have fewer resources available to develop their own digital/information/media literacy (Burns and Gottschalk, 2020[60]). Furthermore, data from the Programme for International Student Assessment (PISA) 2018 show that less advantaged pupils are less clear on how to identify misinformation (OECD, 2021[26]; Suarez-Alvarez, 2021[62]). Even younger parents, often (mis)labelled “digital natives”, struggle to navigate the digital world. Teachers also play an important role in developing critical thinking skills in students: successfully teaching critical thinking hinges on teachers’ attitudes and their ability to create learning environments where students feel safe to take risks in their thinking and expression.

This chapter discusses several important aspects related to information-processing skills and digital landscapes. First, it highlights the importance of achieving baseline proficiency in essential information-processing skills such as literacy, numeracy and science literacy for individuals to navigate complex information environments effectively. Next, it explores the dynamic nature of individuals’ proficiency levels in information processing and how extended engagement in digital cognitive tasks can lead to declines in these skills. The chapter also addresses key metacognitive skills, including self-awareness of one’s problem-solving abilities, the ability to evaluate the credibility of information sources and an understanding of the scientific process. The importance of trust in the scientific process is emphasised, particularly in situations where scientific advice may change, or disagreements among experts may arise. Furthermore, the chapter examines the current state of teaching strategies aimed at helping young people effectively navigate complex digital information landscapes. It considers existing guidelines and policy approaches as illustrative examples. Finally, the chapter concludes by discussing the implications of these findings for policy and practice.

In the digital information landscape, individuals must construct and validate knowledge from multiple sources, including scientific information and numerical data, which may vary in quality and have unknown origins. As such, individuals must possess a wide range of skills and be able to deploy them in combination to extract meaning from information and put it to use. For example, text comprehension skills (reading literacy) are not sufficient for evaluating digital text; they must be accompanied by strong numeracy (mathematics) and scientific literacy. Unfortunately, many adults and young people fail to reach baseline proficiency levels in these skills and are thus at a heightened risk of being unable to meaningfully process information in the form often presented to them. Although in practice individuals need high levels of proficiency, the analyses presented here illustrate the share of adults and young people in OECD countries who fail to meet even minimum baseline levels of proficiency in key information-processing skills. These individuals are particularly vulnerable to making wrong assessments on the basis of information presented online and offline.

Proficiency in accessing, comprehending and evaluating texts, critical reasoning with mathematical content, and the effective use of digital technology for information acquisition, communication and practical tasks are vital skills for navigating information-rich environments in the labour market and everyday life (OECD, 2013[63]). Figure 5.2 shows that, across OECD and EU countries, Japan has the smallest share of adults at the lowest levels of literacy and numeracy proficiency (at or below Level 1) (4%), followed by the Czech Republic and Finland (8%). In contrast, in Peru, almost seven in ten adults score at the lowest level of proficiency in literacy and numeracy, and in Chile and Mexico, around five in ten adults do so.

Young people who fail to reach baseline levels of proficiency in reading, mathematics or science can, at most, solve tasks involving familiar contexts where all relevant information is present, and the tasks are clearly defined, performing actions that are almost always obvious and follow immediately from a given stimulus (OECD, 2019[25]). On average, 34% of 15-year-old students across OECD countries did not reach baseline proficiency levels in one or more key information-processing skills – reading, mathematics or science (Figure 5.3). Estonia (83%) and Japan (79%) have the highest share of 15-year-old students who are baseline all-rounders, meaning they reached baseline proficiency levels in all three domains. By contrast, in Colombia, only 29% of 15-year-old students were baseline all-rounders.

Individuals’ ability to process information, including tasks like understanding, using and interpreting written texts or accessing, using, interpreting and communicating mathematical information and ideas, is not static. It varies based on contextual factors, such as fatigue or motivation to solve a particular task. In information-rich societies, a key skill that allows individuals to engage in effective information processing is task persistence, defined as the capacity to maintain high levels of accuracy and willingness to engage in demanding cognitive tasks (Ryan and Deci, 2000[68]). Greater fatigue and tiredness lead to lower levels of persistence. Moreover, individuals with greater skills relevant to completing a task are more likely to persist in solving it. Similarly, it is crucial for individuals to be able to monitor and recognise their own level of cognitive fatigue and how it affects their information-processing skills. Information-processing abilities may in fact decline after prolonged cognitive effort, and individuals who recognise the potentially harmful effects of such decline may act in ways that reduce their vulnerability to making suboptimal decisions (for example, taking a break, postponing decisions or actions, and/or seeking the counsel of others).

In digital settings, task persistence is highly prized. Both children and adults spend an increasing amount of time using digital technologies to perform long cognitive tasks (Fraillon et al., 2019[69]; Li et al., 2021[70]). Their day-to-day outcomes, i.e. how much they learn, how productive they are and how well they can decipher the information they find on line, depend on their ability to maintain levels of accuracy over long periods or to recognise their cognitive limitations and take actions to address them. This could entail, for example, taking breaks or acknowledging that there is variability in how well they can deploy their information-processing skills, and organising their activities so that their peak levels of ability are devoted to the most challenging and consequential tasks.

Figure 5.4 indicates that, on average, across countries with available data that took part in the Survey of Adult Skills, individuals who took the online test correctly answered 60% of the test items when they were positioned in the first part of the test. However, when the same set of items was placed in the second part of the test, individuals’ accuracy dropped to 57%, indicating a difference of 3 percentage points. At the same time, countries differed with respect to the decline in accuracy experienced by individuals between items placed in the first and second halves of the assessment. In particular, in Ireland, accuracy declined from 58% of correct responses to 53%. Similarly, in England/Northern Ireland (United Kingdom), it declined from 59% to 55%; in France, it declined from 56% to 52%. By contrast, in the Slovak Republic, accuracy remained almost stable – at 60% correct responses – and in the Netherlands, it declined by less than 2 percentage points from 65% to 63%.
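The position effect reported above amounts to a simple tabulation: the same items are scored once when administered early in the test and once when administered late, and the two accuracy rates are compared. A minimal sketch of that calculation, using made-up responses (the records below are illustrative, not actual Survey of Adult Skills data):

```python
# Each record: (item_id, test_half, correct). Made-up illustrative data.
responses = [
    ("item_a", "first", 1), ("item_a", "first", 1),
    ("item_a", "second", 1), ("item_a", "second", 0),
    ("item_b", "first", 1), ("item_b", "first", 0),
    ("item_b", "second", 0), ("item_b", "second", 0),
]

def accuracy(half):
    """Share of correct responses among items administered in a given half."""
    scored = [correct for _, h, correct in responses if h == half]
    return sum(scored) / len(scored)

# The decline is the gap, in percentage points, between the accuracy on
# items placed in the first half and the same items placed in the second.
decline_pp = (accuracy("first") - accuracy("second")) * 100
print(f"accuracy declined by {decline_pp:.0f} percentage points")
```

The published estimates additionally control for item difficulty and respondent characteristics; this sketch shows only the core first-half versus second-half comparison.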

Differences in accuracy among adult populations appear to vary across socio-economic characteristics. For example, Figure 5.5 indicates that, on average, across the adult population surveyed, women experienced a larger decline in accuracy as the test progressed compared to men. Individuals with less than an upper secondary degree experienced significant declines in accuracy over the course of the PIAAC assessment, as did individuals who completed a tertiary degree. Interestingly, those who obtained an upper secondary degree saw, on average, less of a decrease in the percentage of correct responses between the first and second hour of the PIAAC assessment. Figure 5.5 also indicates that individuals coming from households in which neither parent had obtained a tertiary-level degree (a measure of socio-economic status) saw less of a decline in accuracy than individuals coming from households in which one or both parents had obtained a tertiary-level degree.

Declines in levels of accuracy, when individuals are required to complete a long series of information-processing tasks, can also be identified among young people. Figure 5.6 reveals that when comparing the achievement of 15-year-old students who took part in the PISA 2018 assessment of science and mathematics, accuracy declined by 2.2 percentage points on average across OECD countries between the first and second hours of the assessment.1 Declines in accuracy were especially pronounced in Colombia, Australia and Norway, where declines were larger than 3 percentage points and smallest in Greece, Lithuania, Hungary and Finland, where declines were smaller than 1 percentage point.

A key skill for navigating complex information landscapes is being aware of the complexity of information-processing tasks given one’s ability to process information. In 2018, 15-year-old students participating in the PISA assessment were asked to report whether they felt able to understand the difficult texts they had just been presented with in the context of the PISA reading assessment. Figure 5.7 reveals that, on average across OECD countries, 67% of 15-year-old students reported being able to understand difficult texts. Among the top performers in reading – those with reading proficiency of Level 5 and above – 88% of students on average across OECD countries reported being able to understand difficult texts, while only 51% of students who, at most, had baseline levels of reading proficiency reported the same. Interestingly, the two countries where young people doubted their understanding the most, Japan and Korea, are among the countries with the highest reading achievement levels internationally. In Japan, only 28% of students reported being able to understand difficult texts; in Korea, 55% did.

Although fewer low achievers than high achievers reported being able to understand difficult texts, Figure 5.8 reveals that a large number of students have low levels of understanding of digital texts (i.e. they performed poorly in PISA reading tasks) yet believe they have high levels of understanding (i.e. they reported being able to understand difficult texts in the PISA reading assessment). In fact, on average across OECD countries, 25% of 15-year-old students believe they can comprehend difficult texts even though their measured reading proficiency suggests the opposite. Overconfidence, i.e. the overestimation of one’s actual ability to perform a task successfully, is a well-established cognitive bias in psychological research (Kahneman and Tversky, 1996[72]). In the context of information processing, overconfident individuals could make suboptimal decisions for themselves, believing wrong information or misinterpreting correct information. At the same time, they could also contribute to sharing wrong information with confidence or unwillingly creating and disseminating wrong information due to misinterpretation rather than malicious intent.

The share of young people who reported being able to understand difficult texts even though they performed, at most, at PISA proficiency Level 2 in the reading assessment was highest in Romania, where 48% of young people were overconfident in their reading skills, and lowest in Japan, where 7% were. In Romania, Colombia, Mexico, Bulgaria, Malta and Costa Rica (listed in descending order), four or more in ten 15-year-old students reported being able to understand difficult texts even though they had low levels of achievement in reading. By contrast, in Portugal, Germany, Belgium, Canada, Poland, Ireland, Finland, Estonia, Korea and Japan, fewer than two in ten 15-year-old students did so.

In most countries, boys are more likely than girls to report being able to understand difficult texts, though they performed poorly on tasks involving understanding those texts (i.e. they achieved at most proficiency Level 2 in the PISA reading assessment). On average, across OECD countries, girls are almost 40% less likely than boys to be overconfident in their ability to understand difficult texts (Figure 5.9) and in Finland, they are almost 65% less likely. Only in Colombia, Italy, Mexico and Bulgaria (listed in descending order) are boys and girls equally likely to be overconfident in their ability to understand difficult texts.

Alongside exposure to information of variable quality, individuals are increasingly targeted by online fraud, malware and schemes designed to extract their personal data through phishing messages. Taking inappropriate action when being the subject of phishing attempts can put one’s own data at risk, as well as the data of others in one’s social network.

In the context of PISA testing, students were presented with a scenario in which they had to imagine receiving an email from a well-known mobile phone operator telling them they had won a smartphone. They were asked to click on a link to fill out a form with their data so that the smartphone could be sent to them. Next, they were asked how appropriate a series of actions would be, including: 1) answer the email and ask for more information about the smartphone; 2) check the sender’s email address; 3) delete the email without clicking on the link; 4) check the website of the mobile phone operator to see whether the smartphone offer was mentioned; and 5) click on the link to fill out the form as soon as possible. Clicking on the link to fill in the form as soon as possible was clearly inappropriate in the scenario. Yet, on average across OECD countries, only around one in two 15-year-old students (49%) indicated that clicking on the link would not be appropriate when receiving a potential phishing email (Figure 5.10). The share of students who reported that it would not be at all appropriate to click on the link was highest in Denmark (82%) and lowest in Spain (27%).
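One of the checks in the scenario, verifying the sender’s email address, can be sketched as a simple domain comparison: does the address actually belong to the operator’s own domain? The domain names below are hypothetical, and real phishing detection combines many more signals (link targets, urgency cues, reply-to mismatches); this is only a sketch of the single heuristic the scenario describes.

```python
# The domain the legitimate operator is assumed to send from (hypothetical).
KNOWN_DOMAIN = "phone-operator.example"

def looks_suspicious(sender_address: str) -> bool:
    """Flag addresses whose domain does not match the operator's own domain."""
    domain = sender_address.rsplit("@", 1)[-1].lower()
    return domain != KNOWN_DOMAIN

# A look-alike domain ("phone-operator-prize") fails the check even though
# it contains the operator's name, which is what makes it deceptive.
print(looks_suspicious("promo@phone-operator.example"))        # False
print(looks_suspicious("promo@phone-operator-prize.example"))  # True
```

The look-alike example illustrates why "check the sender’s email address" demands more than a glance: the deceptive address contains the operator’s name but belongs to a different domain.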

An important challenge of emerging new phenomena is the high degree of uncertainty that surrounds decision making when situations are new and unknown. This was the case, for example, when the coronavirus (COVID-19) emerged in early 2020, and little was known about how it spread, how infectious it was, how dangerous it was and what treatment, if any, could be adopted if infected. This led to differences in the advice given by different scientists and decision makers and to changes in such advice over time. Such changes could be expected as information about the virus increased alongside the knowledge of its properties and effects. Trusting scientists when there is a lack of agreement among experts, and when the advice any one expert gives changes over time, requires an understanding of how the scientific process unfolds and of the nature of scientific knowledge, as well as a belief in the validity of scientific methods of enquiry as a source of knowing. Scientific explanations hold only until they are proven wrong through experimentation.

In 2015, 15-year-old students participating in the PISA study were asked to report if they strongly agreed, agreed, disagreed or strongly disagreed with the following statements: “A good way to know if something is true is to do an experiment”; “Ideas in science sometimes change”; “Good answers are based on evidence from many different experiments”; “It is good to try experiments more than once to make sure of [your] findings”; “Sometimes scientists change their minds about what is true in science”; and “The ideas in science books sometimes change”. These statements are related to beliefs that scientific knowledge is tentative (to the extent that students recognise that scientific theories are not absolute truths but evolve over time) and to beliefs about the validity and limitations of empirical methods of enquiry as a source of knowing.

Figure 5.11 illustrates the mean index of students’ epistemic beliefs about science (left y-axis). On average, in 2015, 15-year-olds in Canada, Iceland and Portugal had especially high levels of epistemic beliefs in science, meaning that, on average, they were better able than students in other OECD countries to recognise that scientific knowledge is tentative and to understand the validity and limitations of empirical methods of enquiry as a source of knowing. By contrast, students in the Slovak Republic, Hungary and Romania had comparatively low levels of epistemic beliefs in science.

Since the index of epistemic beliefs in science is a composite index that allows for a meaningful comparison of countries but is difficult to interpret, Figure 5.11 also illustrates the percentage of 15-year-old students in each country who reported agreeing or strongly agreeing that “Sometimes scientists change their minds about what is true in science”. This statement is highly correlated with the overall index and describes the situation many young people found themselves in during the COVID-19 pandemic. On average, across OECD countries, 79% of young people agreed or strongly agreed with the statement. Portugal, Denmark, Canada and Korea are the countries with the highest percentages: in Portugal and Denmark, 89% of 15-year-old students agreed or strongly agreed; in Canada and Korea, 88% did. By contrast, in Hungary, Luxembourg, Austria, Romania and Germany, between 65% and 68% of 15-year-old students did so.

In addition to grasping the essence of scientific exploration, adopting specific attitudes towards science and scientists can assist individuals in navigating a complex and swiftly evolving information landscape where scientific facts and advice coexist with sources of misinformation and disinformation. One such attitude is trusting science and scientists.

In 2018, participants in the Wellcome Global Monitor were asked to respond, on a four-point scale (“a lot”, “some”, “not much” and “not at all”), to the following questions and statements: “How much do you trust scientists in this country?”; “In general, would you say that you trust science?”; “In general, how much do you trust scientists to find accurate information about the world?”; “Scientists work with the intention of benefiting the public”; and “Scientists are open and honest about who is paying for their work”.

Figure 5.12 illustrates average levels of a cross-country comparable trust in science index, developed using respondents’ answers to the five items, alongside the percentage who reported that, in general, they trust scientists a lot to find accurate information about the world. On average, across OECD countries, levels of trust in science were highest in Finland, Australia and Norway and lowest in Korea, Colombia and Greece. The percentage of the adult population who reported that, in general, they trust scientists a lot to find accurate information about the world ranged from 15% in Korea to 68% in Spain, with an average across OECD countries of 43%.
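A composite index like this is typically built by recoding each Likert item onto a common numeric scale and aggregating per respondent. The sketch below is a minimal illustration under assumed coding (“a lot” = 3 down to “not at all” = 0, simple unweighted mean); the actual Wellcome Global Monitor index uses its own scaling and weighting:

```python
import statistics

# Hypothetical recoding of the four response categories onto 0-3.
LIKERT = {"a lot": 3, "some": 2, "not much": 1, "not at all": 0}

def trust_index(responses: list[str]) -> float:
    """Unweighted mean of recoded responses across the five trust items."""
    return statistics.mean(LIKERT[r] for r in responses)

# One respondent's answers to the five items.
print(trust_index(["a lot", "some", "a lot", "not much", "some"]))  # → 2.2
```

Averaging recoded items is the simplest aggregation; survey programmes often prefer model-based scaling, which yields indices that are comparable across countries but harder to interpret, as the text notes for the epistemic beliefs index.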

In countries where young people were more likely to believe that scientific knowledge is tentative and understand the validity and limitations of empirical methods of enquiry as a source of knowing, adults also had higher levels of trust in science and scientists (Figure 5.13). These results are correlational but suggest that trust in science and scientists is higher in countries with a more solid understanding of the nature of scientific knowledge.

Figure 5.14 uses data from the 63 countries with data on epistemic beliefs in science collected in the PISA 2015 study and estimates of excess mortality due to COVID-19 modelled by The Economist to illustrate the correlation between epistemic beliefs and adherence to pharmaceutical and non-pharmaceutical interventions that could lower excess mortality. While the first phase of the pandemic witnessed considerable discrepancies across countries in knowledge and in access to face masks, vaccines and other preventive measures to curb the spread of the virus among vulnerable populations, in many countries 2021 brought a relaxation of constraints on the adoption of pharmaceutical and non-pharmaceutical interventions. Figure 5.14 reveals that countries where, before the pandemic, a larger share of young people indicated an understanding that scientific knowledge is tentative, as well as of the validity and limitations of empirical methods of enquiry as a source of knowing, recorded fewer excess deaths due to COVID-19 in 2021.

In its 2018 round, PISA asked 15-year-old students whether they had been taught various strategies for dealing with digital information. On average, across OECD countries, 69% of 15-year-old students have been taught strategies on “how to decide whether to trust information from the Internet”, with the highest percentage of students taught how to do this in Sweden (92%) and the lowest percentage in Poland (39%) (Figure 5.15). In addition, on average across OECD countries, 76% of 15-year-old students have been taught strategies on how “to understand the consequences of making information publicly available on line on Facebook, Instagram, etc.” The highest percentage of students taught such strategies was in the United Kingdom (90%), and the lowest was in Korea (46%). On average, 54% of 15-year-old students in OECD countries have been taught strategies on “how to detect whether information is subjective or biased”. The United States has the highest percentage of students who have been taught strategies to deal with subjective or biased information (79%), while Latvia has the lowest percentage (38%). Finally, 41% of students aged 15 in OECD countries have been taught strategies on “how to detect phishing or spam emails”. Malta has the highest percentage of students taught this (76%), whereas Norway has the smallest percentage of students taught such strategies (22%).

Figure 5.16 shows the percentage of students aged 15 that have been taught strategies to deal with digital information by economic, social and cultural status (ESCS). On average, across OECD countries, 71% of 15-year-old students with high ESCS have been taught strategies on “how to decide whether to trust information from the Internet” versus 68% of students with low ESCS. In addition, 77% of 15-year-old students within the group with high ESCS have been taught strategies on “how to understand the consequences of making information publicly available on line on Facebook, Instagram, etc.” versus 74% of students with low ESCS. Further, 58% of 15-year-old students within the group with high ESCS have been taught strategies related to detecting whether information is subjective or biased compared to 50% of 15-year-old students within the group with low ESCS. Finally, both the high and low ESCS groups had an equal percentage (41%) of 15-year-old students taught strategies for detecting phishing or spam emails.

With the rise of social media, the decline of traditional news outlets and the development of new deep-fake and AI-generative technologies, there has been a proliferation of information available to individuals around the world. However, the increased quantity of information available has not been accompanied by an increase in the quality of such information. Instead, there has been an increase in the amount of false or misleading information individuals are routinely exposed to due to the proliferation of misinformation, disinformation and malinformation. Most people in OECD countries worry about being exposed to false or misleading information.

Beliefs in false and inaccurate information can be difficult to change once established, threatening social cohesion and undermining the effectiveness of policy action, as exemplified during the pandemic. In fact, such beliefs can persist and sometimes even be reinforced when individuals are exposed to corrective information (Swire-Thompson, DeGutis and Lazer, 2020[76]). Given that beliefs in false or misleading information can persist, and that we have entered what some have named a “post-truth” era, this chapter considered the extent to which individuals in OECD countries have developed some of the skills needed to reduce their vulnerability to false and misleading information. The chapter argues that alongside high levels of information-processing abilities, individuals will need high levels of metacognitive skills and an awareness of the limits of human cognition.

Mapping the distribution of these skills in the population is critical to empowering people and communities. On the one hand, identifying how many people are vulnerable, and who is most vulnerable, can be used to target policy action. On the other hand, mapping the “scale of the problem” can be used to mobilise structural efforts to reduce exposure to false and misleading information and promote a high-quality information landscape. Even when individuals could theoretically use their skills to identify the source and veracity of information, doing so systematically would be practically impossible. Measures that reduce the burden on people’s information-processing skills, such as utilising technology to clearly indicate the source of online information (similar to the traceability chain used for food products), would significantly empower information users.

The results reported in this chapter indicate that one in four students is at high risk of believing misinformation because of excessive confidence in their information retrieval and inference skills, despite low proficiency in reading. Similarly, almost one in three students with mathematics proficiency at Level 2 or below report knowing a concept in maths that does not exist. High levels of information-processing skills, such as high levels of reading, maths and science, should be accompanied by the habit (or skill) of critically evaluating one’s understanding and knowledge when accessing complex information of variable quality on line. The large between-country variation in the share of young people who have low levels of information-processing skills but high levels of confidence in their abilities suggests that it is possible to reduce this source of vulnerability to low-quality information. Crucially, because individuals both receive and actively participate in spreading information, reducing overconfidence can reduce both individual and broader societal vulnerabilities. Building people’s information-processing skills should be accompanied by efforts to promote their ability to reflect critically on their understanding of complex information.

The data also reveal that on average, across OECD countries, 69% of 15-year-old students report having been taught strategies on “how to decide whether to trust information from the Internet”; 76% have been taught strategies on how “to understand the consequences of making information publicly available on line on Facebook, Instagram, etc.”; 54% have been taught strategies on “how to detect whether the information is subjective or biased”; and 41% have been taught strategies on “how to detect phishing or spam emails”. These results suggest that there is considerable room to further develop strategies to empower young people with the set of specific skills that could help them stop consuming and spreading false or misleading information.

Across OECD countries, many young people do not have a good understanding of the scientific process and the validity of different forms of information-gathering processes. In fact, as many as 21% of 15-year-old students indicate disagreeing or strongly disagreeing that sometimes scientists change their minds about what is true in science. Learning this lesson would help sustain trust in science and scientists even as their positions on specific facts change. Trust is essential to any information and communication exchange.

Although trust is, prima facie, a source of vulnerability, it should not be equated with gullibility. Trust, whether in science, interpersonal relationships, or governments and institutions, is based on the cognitive ability of individuals to discern the trustworthiness of particular individuals or institutions within specific contexts. Information-processing abilities empower people to perform better at the problem-solving tasks represented by information and communication exchanges, thus reducing the likelihood of misplaced trust (Borgonovi and Pokropek, 2022[77]). Knowing how the scientific process works is critical for individuals to understand why scientists can change their advice and recommendations when new evidence emerges that disproves previously held ideas. This, in turn, is critical to sustain and maintain trust in scientific information even though scientists may not agree with each other or may change their views over time.

Finally, analyses suggest that empowering individuals to become better users and producers of information would require abandoning a “fixed” view of individuals’ skills and abilities and considering that individuals can differ in their capacity to deploy their skills in different situations and in response to environmental stimuli. This is critical because individuals are required to spend an increasing amount of time processing complex information off and on line. However, their ability to accurately process and use such information to perform different tasks declines as they grow fatigued. Students and adults should develop an awareness that their proficiency is not fixed but depends on their cognitive exhaustion and on other circumstances they face in their environment. Building task persistence should be a priority, but so should promoting awareness about fluctuations in information-processing abilities. Developing an awareness of when one’s accuracy in processing information is high and when it is low is critical to ensuring that individuals are empowered to make decisions on consequential matters or perform difficult tasks when their cognitive abilities are at their highest, for example, when they are refreshed and have taken a break.

Media literacy was identified as a key component to managing misinformation. However, addressing the societal challenge related to false and misleading content cannot be achieved through media literacy alone. Instead, a combination of multiple “literacies” brought together under a clear framework is required for societies to function in a rapidly changing media environment (Jones-Jang, Mortensen and Liu, 2019[29]). Furthermore, media/digital/information literacy education is not a “silver bullet” for solving the disinformation challenge (Jang et al., 2018[78]). Literacy development is only a part of a broader suite of policies that can help countries respond to the threat of false and misleading information. Policy options must consider each country’s social context, including differing legal systems, precedents and approaches to the protection of freedom of speech.

References

[20] Ankel, S. (2020), Law enforcement officials fear that the US will see an increase in arson and violence linked to 5G conspiracy theories, according to reports, https://www.businessinsider.com/coronavirus-violence-feared-as-5g-conspiracy-theories-reach-us-abc-2020-5?r=US&IR=T.

[14] Barthel, M., A. Mitchell and J. Holcomb (2016), Many Americans Believe Fake News Is Sowing Confusion, Pew Research Center, Washington, DC, https://www.pewresearch.org/journalism/2016/12/15/many-americans-believe-fake-news-is-sowing-confusion/.

[18] Bawden, D. and L. Robinson (2020), Information Overload: An Introduction, Oxford University Press, Oxford, https://doi.org/10.1093/acrefore/9780190228637.013.1360.

[23] Benegal, S. and L. Scruggs (2018), “Correcting misinformation about climate change: The impact of partisanship in an experimental setting”, Climatic Change, Vol. 148/1-2, pp. 61-80, https://doi.org/10.1007/s10584-018-2192-4.

[43] Biron, C. (2023), “US schools teach ’media literacy’ to fight online disinformation”, Context, https://www.context.news/big-tech/us-schools-teach-media-literacy-to-fight-online-disinformation (accessed on 22 August 2023).

[19] Borges do Nascimento, I. et al. (2022), “Infodemics and health misinformation: A systematic review of reviews”, Bulletin of the World Health Organization, Vol. 100/9, pp. 544-561, https://doi.org/10.2471/blt.21.287654.

[77] Borgonovi, F. and A. Pokropek (2022), “The role of birthplace diversity in shaping education gradients in trust: Country and regional level mediation-moderation analyses”, Social Indicators Research, Vol. 164/1, pp. 239-261, https://doi.org/10.1007/s11205-022-02948-z.

[74] Borgonovi, F. and A. Pokropek (2020), Can we rely on trust in science to beat the COVID-19 pandemic?, Center for Open Science, https://doi.org/10.31234/osf.io/yq287.

[60] Burns, T. and F. Gottschalk (eds.) (2020), Education in the Digital Age: Healthy and Happy Children, Educational Research and Innovation, OECD Publishing, Paris, https://doi.org/10.1787/1209166a-en.

[59] Cawthorne, B. (2021), Safer Internet Day Press Release 2021, https://saferinternet.org.uk/blog/safer-internet-day-press-release-2021.

[21] Cerulus, L. (2020), 5G arsonists turn up in continental Europe, https://www.politico.com/news/2020/04/26/5g-mast-torchers-turn-up-in-continental-europe-210736.

[54] Chapman, M. (2016), Mapping of Media Literacy Practices and Actions in EU-28, European Audiovisual Observatory, Strasbourg, https://rm.coe.int/1680783500.

[3] Chesney, R. and D. Citron (2018), “21st century-style truth decay: Deep fakes and the challenge for privacy, free expression, and national security”, Md. L. Rev, Vol. 78, https://digitalcommons.law.umaryland.edu/cgi/viewcontent.cgi?article=3834&context=mlr.

[13] Corbu, N. et al. (2020), “‘They can’t fool me, but they can fool the others!’ Third person effect and fake news detection”, European Journal of Communication, Vol. 35/2, pp. 165-180, https://doi.org/10.1177/0267323120903686.

[31] Cortesi, S. et al. (2020), “Youth and digital citizenship+ (plus): Understanding skills for a digital world”, SSRN Electronic Journal, https://doi.org/10.2139/ssrn.3557518.

[27] d’Ancona, M. (2017), Post-Truth: The New War on Truth and How to Fight Back, Random House, New York.

[4] Edelman (2023), 2023 Edelman Trust Barometer Global Report, https://www.edelman.com/trust/2023/trust-barometer.

[5] Edelman (2022), 2022 Edelman Trust Barometer Global Report, https://www.edelman.com/trust/2022-trust-barometer.

[41] Education Commission of the States (2021), “Response to information request: Which states have passed legislation about requiring media literacy and/or civics and what those bills”, Education Commission of the States, https://www.ecs.org/wp-content/uploads/State-Information-Request_Media-Literacy.pdf (accessed on 22 August 2023).

[53] Edwards, L. et al. (2021), Rapid Evidence Assessment on Online Misinformation and Media Literacy: Final Report for OFCOM, https://www.ofcom.org.uk/__data/assets/pdf_file/0011/220403/rea-online-misinformation.pdf.

[38] Finnish National Agency for Education (2022), Digital Competences and Capacities in Youth Work, https://www.oph.fi/en/statistics-and-publications/publications/report-digital-competences-and-capacities-youth-work.

[69] Fraillon, J. et al. (2019), Preparing for Life in a Digital World: IEA International Computer and Information Literacy Study 2018 - International Report, International Association for the Evaluation of Educational Achievement (IEA), Amsterdam, https://www.iea.nl/sites/default/files/2019-11/ICILS%202019%20Digital%20final%2004112019.pdf.

[30] Huguet, A. et al. (2019), Exploring Media Literacy Education as a Tool for Mitigating Truth Decay, RAND Corporation, Santa Monica, CA, https://doi.org/10.7249/rr3050.

[1] Huijstee, M. et al. (2022), Tackling Deepfakes in European Policy, European Parliament, https://doi.org/10.2861/325063.

[48] Illinois State Board of Education (2022), Media Literacy: Public Act 102-0055, https://www.isbe.net/Documents/Media-Literacy-Fact-Sheet.pdf (accessed on 22 August 2023).

[78] Jang, S. et al. (2018), “A computational approach for examining the roots and spreading patterns of fake news: Evolution tree analysis”, Computers in Human Behavior, Vol. 84, pp. 103-113, https://doi.org/10.1016/j.chb.2018.02.032.

[29] Jones-Jang, S., T. Mortensen and J. Liu (2019), “Does media literacy help identification of fake news? Information literacy helps, but other literacies don’t”, American Behavioral Scientist, Vol. 65/2, pp. 371-388, https://doi.org/10.1177/0002764219869406.

[72] Kahneman, D. and A. Tversky (1996), “On the reality of cognitive illusions.”, Psychological Review, Vol. 103/3, pp. 582-591, https://doi.org/10.1037/0033-295x.103.3.582.

[36] Kansallinen audiovisuaalinen instituutti (2021), Finnish Media Education, https://kavi.fi/wp-content/uploads/2021/01/Finnish-Media-Education.pdf (accessed on 23 May 2023).

[45] LegiScan (2023), New Jersey Senate Bill 588, https://legiscan.com/NJ/text/S588/id/2610908 (accessed on 22 August 2023).

[34] Lessenski, M. (2022), How It Started, How It is Going: Media Literacy Index 2022, https://osis.bg/?p=4243&lang=en.

[32] Livingstone, S. (2003), “The changing nature and uses of media literacy”, Media@LSE Electronic Working Papers, No. 4, Media@lse, London School of Economics and Political Science, London, UK, http://eprints.lse.ac.uk/13476/.

[70] Li, Y. et al. (2021), “Internet addiction increases in the general population during COVID‐19: Evidence from China”, The American Journal on Addictions, Vol. 30/4, pp. 389-397, https://doi.org/10.1111/ajad.13156.

[16] Lyons, B. et al. (2021), “Overconfidence in news judgments is associated with false news susceptibility”, Proceedings of the National Academy of Sciences, Vol. 118/23, https://doi.org/10.1073/pnas.2019527118.

[55] Machete, P. and M. Turpin (2020), “The Use of Critical Thinking to Identify Fake News: A Systematic Literature Review”, in Hattingh, M. et al. (eds.), Responsible Design, Implementation and Use of Information and Communication Technology, I3E 2020, Lecture Notes in Computer Science, Vol 12067, Springer International Publishing, Cham, https://doi.org/10.1007/978-3-030-45002-1_20.

[37] Mackintosh, E. (2019), Finland is winning the war on fake news. What it’s learned may be crucial to Western democracy, https://edition.cnn.com/interactive/2019/05/europe/finland-fake-news-intl/.

[42] Media Literacy Now (2023), “U.S. Media Literacy Policy Report 2022”, Media Literacy Now, https://medialiteracynow.org/wp-content/uploads/2023/05/MediaLiteracyPolicyReport2022.pdf (accessed on 22 August 2023).

[40] mediataitoviikko (2023), Media Literacy Week celebrates diversity in creating and developing a better media environment for all, https://www.mediataitoviikko.fi/in-english/ (accessed on 27 April 2023).

[51] Ministry of Education, Culture, Sports, Science, and Technology (2020), New Curriculum Standards, http://www.mext.go.jp/a_menu/shotou/new-cs/1384661.htm (accessed on 14 August 2023).

[15] Moravec, P., R. Minas and A. Dennis (2018), “Fake news on social media: People believe what they want to believe when it makes no sense at all”, SSRN Electronic Journal, https://doi.org/10.2139/ssrn.3269541.

[50] Nakahashi, Y. (2015), “Media Katsuyo to Literacy No Ikusei [Practical Use of the Media and Development of Literacy]”, Hoso Media Kenkyu [Broadcast Media Studies].

[17] OECD (2022), “Misinformation and disinformation: An international effort using behavioural science to tackle the spread of misinformation”, OECD Public Governance Policy Papers, No. 21, OECD Publishing, Paris, https://doi.org/10.1787/b7709d4f-en.

[26] OECD (2021), 21st-Century Readers: Developing Literacy Skills in a Digital World, PISA, OECD Publishing, Paris, https://doi.org/10.1787/a83d84cb-en.

[25] OECD (2019), PISA 2018 Results (Volume I): What Students Know and Can Do, PISA, OECD Publishing, Paris, https://doi.org/10.1787/5f07c754-en.

[66] OECD (2019), Survey of Adult Skills (PIAAC) databases, http://www.oecd.org/skills/piaac/publicdataandanalysis/.

[67] OECD (2018), PISA database 2018, http://www.oecd.org/pisa/data/2018database/.

[71] OECD (2017), Programme for the International Assessment of Adult Competencies (PIAAC), Log Files, GESIS Data Archive, Cologne, https://doi.org/10.4232/1.12955.

[65] OECD (2015), PISA database 2015, https://www.oecd.org/pisa/data/2015database/.

[63] OECD (2013), Technical Report of the Survey of Adult Skills (PIAAC), OECD Publishing, Paris, https://www.oecd.org/skills/piaac/_Technical%20Report_17OCT13.pdf.

[64] OECD (2012), PISA database 2012, https://www.oecd.org/pisa/pisaproducts/pisa2012database-downloadabledata.htm.

[11] Papapicco, C., I. Lamanna and F. D’Errico (2022), “Adolescents’ vulnerability to fake news and to racial hoaxes: A qualitative analysis on Italian sample”, Multimodal Technologies and Interaction, Vol. 6/3, p. 20, https://doi.org/10.3390/mti6030020.

[22] Pasley, J. (2020), 17 cell phone towers in New Zealand have been vandalized since the lockdown, coinciding with a boom in 5G conspiracy theories, http://businessinsider.in/international/news/17-cell-phone-towers-in-new-zealand-have-been-vandalized-since-the-lockdown-coinciding-with-a-boom-in-5g-conspiracy-theories/articleshow/75833003.cms.

[12] Pennycook, G. et al. (2021), “Shifting attention to accuracy can reduce misinformation online”, Nature, Vol. 592/7855, pp. 590-595, https://doi.org/10.1038/s41586-021-03344-2.

[9] Pennycook, G. and D. Rand (2019), “Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning”, Cognition, Vol. 188, pp. 39-50, https://doi.org/10.1016/j.cognition.2018.06.011.

[47] Pirlo, F. and D. Novak (2023), “Media, Internet Literacy Needs Increase in US Schools”, VOX, https://learningenglish.voanews.com/a/media-internet-literacy-needs-increase-in-us-schools/7014620.html (accessed on 22 August 2023).

[33] Potter, W. (2010), “The state of media literacy”, Journal of Broadcasting & Electronic Media, Vol. 54/4, pp. 675-696, https://doi.org/10.1080/08838151.2011.521462.

[68] Ryan, R. and E. Deci (2000), “Self-determination theory and the facilitation of intrinsic motivation, social development, and well-being”, American Psychologist, Vol. 55/1, pp. 68–78, https://doi.org/10.1037/0003-066X.55.1.68.

[7] Sadeghi, M. and L. Arvanitis (2023), Rise of the newsbots: AI-generated news websites proliferating online, https://www.newsguardtech.com/special-reports/newsbots-ai-generated-news-websites-proliferating/ (accessed on 10 May 2023).

[39] Salomaa, S. and L. Palsa (2019), Media Literacy in Finland: National Media Education Policy, Ministry of Education and Culture, https://julkaisut.valtioneuvosto.fi/bitstream/handle/10024/162065/OKM_2019_39.pdf?sequence=1&isAllowed=y.

[44] Sitrin, C. (2023), “New Jersey becomes first state to mandate K-12 students learn information literacy”, POLITICO, https://www.politico.com/news/2023/01/05/new-jersey-is-the-first-state-to-mandate-k-12-students-learn-information-literacy-00076352 (accessed on 22 August 2023).

[35] Standish, R. (2017), “Why is Finland better at fending off Russian-linked fake news?”, Toronto Star, https://www.thestar.com/news/world/2017/03/01/why-is-finland-better-at-fending-off-russian-linked-fake-news.html (accessed on 27 April 2022).

[46] State of New Jersey (2023), Governor Murphy Signs Bipartisan Legislation Establishing First in the Nation K-12 Information Literacy Education, https://www.nj.gov/governor/news/news/562022/20230104b.shtml (accessed on 22 August 2023).

[62] Suarez-Alvarez, J. (2021), “Are 15-year-olds prepared to deal with fake news and misinformation?”, PISA in Focus, No. 113, OECD Publishing, Paris, https://doi.org/10.1787/6ad5395e-en.

[52] Suzuki, K. (2008), “Development of Media Education in Japan”, Educational Technology Research and Development, Vol. 31, pp. 1-12, https://www.jstage.jst.go.jp/article/etr/31/1-2/31_KJ00005101190/_pdf/-char/en.

[76] Swire-Thompson, B., J. DeGutis and D. Lazer (2020), “Searching for the backfire effect: Measurement and design considerations.”, Journal of Applied Research in Memory and Cognition, Vol. 9/3, pp. 286-299, https://doi.org/10.1016/j.jarmac.2020.06.006.

[2] Ternovski, J., J. Kalla and P. Aronow (2021), “Deepfake warnings for political videos increase disbelief but do not improve discernment: Evidence from two experiments”, OSF Preprints, https://doi.org/10.31219/osf.io/dta97.

[61] Terras, M. and J. Ramsay (2016), “Family digital literacy practices and children’s mobile phone use”, Frontiers in Psychology, Vol. 7, https://doi.org/10.3389/fpsyg.2016.01957.

[75] The Economist (2021), Tracking covid-19 excess deaths across countries, https://www.economist.com/graphic-detail/coronavirus-excess-deaths-tracker.

[57] Thomas, K. and B. Lok (2015), “Teaching Critical Thinking: An Operational Framework”, in The Palgrave Handbook of Critical Thinking in Higher Education, Palgrave Macmillan US, New York, https://doi.org/10.1057/9781137378057_6.

[24] Tucker, J. et al. (2018), “Social media, political polarization, and political disinformation: A review of the scientific literature”, SSRN Electronic Journal, https://doi.org/10.2139/ssrn.3144139.

[58] Twenge, J., G. Martin and B. Spitzberg (2019), “Trends in U.S. adolescents’ media use, 1976–2016: The rise of digital media, the decline of TV, and the (near) demise of print”, Psychology of Popular Media Culture, Vol. 8/4, pp. 329-345, https://doi.org/10.1037/ppm0000203.

[49] UNESCO (2020), “Media and Information Literacy Education in Asia: Exploration of policies and practices in Japan, Thailand, Indonesia, Malaysia, and the Philippines”, United Nations Educational, Scientific and Cultural Organization (UNESCO), https://unesdoc.unesco.org/in/documentViewer.xhtml?v=2.1.196&id=p::usmarcdef_0000374575&file=/in/rest/annotationSVC/DownloadWatermarkedAttachment/attach_import_ff77a3ff-cd5a-4703-a513-75731059789f%3F_%3D374575eng.pdf&locale=en&multi=true&ark=/ark:/48223/p (accessed on 14 August 2023).

[28] Valverde-Berrocoso, J., A. González-Fernández and J. Acevedo-Borrega (2022), “Disinformation and multiliteracy: A systematic review of the literature”, Comunicar, Vol. 30/70, pp. 97-110, https://doi.org/10.3916/c70-2022-08.

[56] Vardi, I. (2015), “The Relationship between Self-Regulation, Personal Epistemology, and Becoming a “Critical Thinker”: Implications for Pedagogy”, in The Palgrave Handbook of Critical Thinking in Higher Education, Palgrave Macmillan US, New York, https://doi.org/10.1057/9781137378057_13.

[10] Vosoughi, S., D. Roy and S. Aral (2018), “The spread of true and false news online”, Science, Vol. 359/6380, pp. 1146-1151, https://doi.org/10.1126/science.aap9559.

[8] Wardle, C. and H. Derakhshan (2017), Information Disorder: Toward an Interdisciplinary Framework for Research and Policy Making, Council of Europe, https://edoc.coe.int/en/media/7495-information-disorder-toward-an-interdisciplinary-framework-for-research-and-policy-making.html.

[73] Wellcome Trust (2018), Wellcome Global Monitor 2018, https://wellcome.org/reports/wellcome-global-monitor/2018.

[6] World Risk Poll (2019), The Lloyd’s Register Foundation World Risk Poll Report 2019, https://wrp.lrfoundation.org.uk/.

Note

1. Declines in accuracy between the Survey of Adult Skills (PIAAC) and the PISA study cannot be compared because the two tests differ in their items as well as in their administration conditions. The PISA test is timed and lasts two hours, whereas the Survey of Adult Skills (PIAAC) is untimed (it is designed to last around 40 minutes, but participants can take as much time as they wish to complete the test). Only results for mathematics and science are considered from the PISA 2018 assessment: although reading was the main domain, it is excluded because of the adaptive design adopted for the reading assessment.

Legal and rights

This document, as well as any data and map included herein, are without prejudice to the status of or sovereignty over any territory, to the delimitation of international frontiers and boundaries and to the name of any territory, city or area. Extracts from publications may be subject to additional disclaimers, which are set out in the complete version of the publication, available at the link provided.

© OECD 2023

The use of this work, whether digital or print, is governed by the Terms and Conditions to be found at https://www.oecd.org/termsandconditions.