Toolbox of Individual-level Interventions against Online Misinformation

Affiliation

Max Planck Institute for Human Development - plus see below for full authors' affiliations

Summary

"In its capacity to damage public health and the health of democracies, online misinformation poses a major policy problem..."

One approach to addressing the spread of online misinformation involves content moderation at the platform level, which risks impinging on freedom of expression. Another approach to mitigating the effect of misinformation focuses on individual-level interventions that are designed to act on people's cognition and behaviour to reduce either their tendency to share misinformation or the extent to which they are affected by it. This paper introduces a toolbox of individual-level interventions for reducing harm from online misinformation. It is accompanied by a detailed online research and policy resource designed for a broad audience: a toolbox of interventions against online misinformation.

This paper represents a collaborative effort of an international group of experts from 25 institutions and universities. The main goal is to identify a collection of empirically validated cognitive and behavioural interventions against misinformation that target different behavioural and cognitive outcomes. The intervention types in the toolbox explicitly tackle the challenge of misinformation, encompassing disinformation, false and misleading information, fake news, and related issues. The toolbox does not evaluate the interventions' potential to be implemented; nor does it analyse their comparative effectiveness.

The toolbox described here focuses on two points of interest: a conceptual overview of the interventions and an overview of the empirical evidence supporting the interventions, including the methods used to test them. Both overviews are publicly available as an online supplement in the form of two dynamic tables: a conceptual overview and an evidence overview. The online supplement also contains selected examples of interventions and a world map of evidence.

The toolbox includes nine types of individual-level interventions, all supported by peer-reviewed, published evidence:
 

  1. Accuracy prompts encourage users to consider the veracity of information before sharing it.
  2. Friction involves introducing small obstacles, such as requiring users to read an article before sharing it.
  3. Social norms emphasise that most people do not share misinformation, encouraging others to follow these positive behaviours.
  4. Inoculation exposes people to a weakened version of misleading arguments to build resistance to similar claims in the future.
  5. Lateral reading and verification strategies encourage users to check multiple sources to quickly assess the credibility of content.
  6. Media-literacy tips can help users critically evaluate online information.
  7. Debunking and rebuttals directly correct false claims by providing accurate information.
  8. Warning and fact-checking labels flag potentially false or misleading content to prompt users to think twice before accepting or sharing it.
  9. Source-credibility labels add credibility ratings of news sources.

These nine types of interventions fall under three intervention categories: nudges, which target behaviours; boosts and educational interventions, which target competences; and refutation strategies, which target beliefs. Note that interventions may fall under more than one category. Table 1 in the paper provides a condensed overview of the intervention types, listed by policy intervention category.
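The taxonomy above can be pictured as a small data structure mapping intervention types to categories. This is a hypothetical sketch, not a reproduction of the paper's Table 1: the category assignments below (including the multi-category example) are illustrative placeholders.

```python
# Hypothetical sketch of the toolbox taxonomy. Category assignments are
# illustrative placeholders, NOT the paper's Table 1; "warning and
# fact-checking labels" appears in two categories only to demonstrate
# that interventions may fall under more than one category.

CATEGORY_TARGETS = {
    "nudge": "behaviours",
    "boost/education": "competences",
    "refutation": "beliefs",
}

TOOLBOX = {
    "accuracy prompts": {"nudge"},
    "friction": {"nudge"},
    "social norms": {"nudge"},
    "inoculation": {"boost/education"},
    "lateral reading": {"boost/education"},
    "media-literacy tips": {"boost/education"},
    "debunking and rebuttals": {"refutation"},
    "warning and fact-checking labels": {"nudge", "refutation"},
    "source-credibility labels": {"nudge"},
}

def interventions_targeting(outcome: str) -> list[str]:
    """Intervention types whose category targets the given outcome."""
    cats = {c for c, t in CATEGORY_TARGETS.items() if t == outcome}
    return sorted(name for name, cs in TOOLBOX.items() if cs & cats)

print(interventions_targeting("beliefs"))
# ['debunking and rebuttals', 'warning and fact-checking labels']
```

A set-valued mapping, rather than a single category per intervention, makes the paper's point that categories overlap directly queryable.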

The toolbox also provides a summary of the evidence behind the nine types of interventions. This part of the toolbox is based on 81 scientific papers and is available online as a searchable and expandable table at https://interventionstoolbox.mpib-berlin.mpg.de/table_evidence.html. The table includes several empirical papers for each intervention, as well as an overview of each paper's sample, experimental paradigm, study design, outcome measures, main findings, and longevity tests. Expanding the row associated with a paper reveals more detailed information about methods and effect sizes, the full reference, a link to open data (if available), and the abstract. A separate section of the toolbox maps these empirical papers to the countries in which they were conducted.
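The row structure just described can be sketched as a simple record type. The field names below mirror the columns named in the text, not the site's actual data model, so treat this as an assumption-laden illustration.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical record for one row of the evidence table; field names
# mirror the columns described in the text, not the site's real schema.
@dataclass
class EvidenceEntry:
    intervention: str            # one of the nine toolbox types
    reference: str               # full citation of the empirical paper
    sample: str                  # participant sample description
    paradigm: str                # experimental paradigm
    design: str                  # study design
    outcome_measures: list[str]  # outcome variables reported
    main_findings: str           # summary of results
    longevity_tested: bool       # whether long-term effects were measured
    open_data_url: Optional[str] = None  # link to open data, if available

# Illustrative instance with placeholder values (not a real table row).
entry = EvidenceEntry(
    intervention="inoculation",
    reference="(placeholder citation)",
    sample="(placeholder sample description)",
    paradigm="technique adoption",
    design="randomised experiment",
    outcome_measures=["assessment scores"],
    main_findings="(placeholder)",
    longevity_tested=False,
)
```

Making `open_data_url` optional reflects the text's note that open-data links are shown only "if available".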

Several observations about the current state of the literature can be derived from this evidence overview:

  • Evidence from the Global North is overrepresented for many intervention types in the toolbox. However, several intervention types have been tested across the globe. Although the interventions target universal problems and behaviours, they are sensitive to cultural differences. There is a potentially complex relationship between people's cultural contexts and their competences and behaviours vis-à-vis misinformation. Future studies are needed to examine this potentially intricate relationship in more detail.
  • Few studies have tested the long-term effects of interventions. The mechanisms behind the longevity or lack thereof of interventions' effects are currently poorly understood.
  • It is difficult to compare interventions due to variability in how their effectiveness was studied. The core differences relate to participants' tasks - in particular, the paradigm used, including the test stimuli (for example, news headlines, real-world claims, or websites), and the measured outcome variables (for example, belief or credibility ratings and behavioural measures). Ongoing efforts to systematically compare interventions in large-scale megastudies would benefit from appropriate standards for paradigms, test stimuli, and outcome measures. As a first step, the paper identifies four main paradigms in research on misinformation interventions:
    • The misinformation-correction paradigm, which typically presents people with corrections and measures the effect on their belief in relevant misinformation claims, their claim-related inferential reasoning, and their attitudes towards associated issues. Key outcome variables in this paradigm include belief and attitude ratings, as well as reliance on misinformation when responding to inferential-reasoning questions.
    • The headline-discernment paradigm, whereby participants evaluate the plausibility, credibility, or veracity of true and false headlines and indicate whether they would be willing to share them. Outcome variables include accuracy ratings and people's ability to discern true and false headlines, as well as measures of sharing intention and sharing discernment (that is, the difference in sharing true versus false headlines).
    • The technique-adoption paradigm, which assesses whether participants learn the skills and strategies required to evaluate information veracity. Outcome variables include assessment scores for demonstrating the skills learned during the intervention (for example, identifying an information source and assessing its credibility), and stimuli include online articles or entire websites.
    • The emerging paradigm of field studies on social media, where outcome variables may include the quality of people's information diets, the quality of what people share on their newsfeeds, and people's accuracy or sharing discernment.
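Two of the outcome measures above, accuracy discernment and sharing discernment, are each defined as a difference between responses to true and false headlines. A minimal sketch of that computation, with an assumed trial format:

```python
# Minimal sketch of discernment measures from the headline-discernment
# paradigm. The trial format (dicts with an 'is_true' flag and a 0/1
# response) is an assumption for illustration, not taken from the paper.

def discernment(trials, key):
    """Mean response to true headlines minus mean response to false ones.

    `key` selects the 0/1 response, e.g. 'rated_accurate' for accuracy
    discernment or 'shared' for sharing discernment."""
    true_r = [t[key] for t in trials if t["is_true"]]
    false_r = [t[key] for t in trials if not t["is_true"]]
    return sum(true_r) / len(true_r) - sum(false_r) / len(false_r)

trials = [
    {"is_true": True,  "shared": 1},
    {"is_true": True,  "shared": 1},
    {"is_true": False, "shared": 1},
    {"is_true": False, "shared": 0},
]

# This participant shares 100% of true vs 50% of false headlines.
print(discernment(trials, "shared"))  # 0.5
```

A participant who shares everything indiscriminately would score 0, which is why discernment, rather than raw sharing rate, is the outcome of interest.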

The international group of experts behind this initiative envision several uses for the toolbox:

  • For researchers, it provides a starting point for meta-analytic studies, systematic reviews and studies comparing the effectiveness of different interventions. It can also inform efforts to standardise and coordinate methods, thereby increasing the comparability of future results. Furthermore, the toolbox highlights gaps in the available evidence (for example, underrepresented populations and cultures) that should be addressed in future studies.
  • For policymakers and the public, the toolbox is meant to provide accessible, up-to-date scientific knowledge that can inform policy discussions about misinformation countermeasures and platform regulation. The toolbox can also be used as a resource for educational programmes. Where a single intervention may have only limited effects, the toolbox can help policymakers, educators, and the public (along with researchers) combine interventions to address different aspects of a misinformation problem.

Per the authors, "[t]o understand the factors that influence the spread of misinformation, data access and collaborative efforts between researchers and platforms are crucial. Individual-focused interventions can only go so far in the face of complex global threats such as misinformation. Whereas individual-level approaches aim at mitigating misinformation by acting on individuals' ability to recognize and not spread falsehoods, system-level approaches can aim at making the entire online ecosystem less conducive to the spread of misinformation - for instance, through platform design, content moderation, communities of fact-checkers and journalists, and high-level regulatory and policy interventions (such as investing in public broadcasters and establishing regulatory frameworks that promote a diverse media landscape). System-level interventions may be particularly effective and long-lasting."

Full list of authors, with institutional affiliations: Anastasia Kozyreva, Max Planck Institute for Human Development; Philipp Lorenz-Spreen, Max Planck Institute for Human Development; Stefan Herzog, Max Planck Institute for Human Development; Ullrich Ecker, University of Western Australia; Stephan Lewandowsky, University of Bristol and University of Potsdam; Ralph Hertwig, Max Planck Institute for Human Development; Ayesha Ali, Lahore University of Management Sciences; Joe Bak-Coleman, Columbia University; Sarit Barzilai, University of Haifa; Melisa Basol, University of Cambridge; Adam J. Berinsky, Massachusetts Institute of Technology; Cornelia Betsch, University of Erfurt and Bernhard Nocht Institute for Tropical Medicine; John Cook, Monash University; Lisa K. Fazio, Vanderbilt University; Michael Geers, Max Planck Institute for Human Development and Humboldt University of Berlin; Andrew M. Guess, Princeton University; Huang Haifeng, Ohio State University; Horacio Larreguy, Instituto Tecnológico Autónomo de México; Rakoen Maertens, University of Cambridge; Folco Panizza, IMT School for Advanced Studies Lucca; Gordon Pennycook, University of Regina; David J. Rand, Massachusetts Institute of Technology; Steve Rathje, New York University; Jason Reifler, University of Exeter; Philipp Schmid, University of Erfurt, Bernhard Nocht Institute for Tropical Medicine, and Radboud University Nijmegen; Mark Smith, Stanford University; Briony Swire-Thomson, Northeastern University; Paula Szewach, University of Exeter and University of Essex; Sander van der Linden, University of Cambridge; Sam Wineburg, Stanford University

The toolkit is also available as an online supplement that contains selected examples of interventions.

Source

Nature Human Behaviour, Volume 8, June 2024, 1044-1052 https://doi.org/10.1038/s41562-024-01881-0 - sourced from "Empowering Individuals: A Comprehensive Toolbox for Combating Online Misinformation", by Nicole Siller, Max Planck Institute, Media & Learning. Image credit: Rawpixel (public domain)