Development action with informed and engaged societies
As of March 15, 2025, The Communication Initiative (The CI) platform is operating at a reduced level, with no new content being posted to the global website and registration/login functions disabled. (La Iniciativa de Comunicación, or CILA, will keep running.) While many interactive functions are no longer available, The CI platform remains open for public use, with all content accessible and searchable until the end of 2025.

Please note that some links within our knowledge summaries may be broken due to changes in external websites. The loss of access to the USAID website, for instance, has left many links broken. We can only hope that these valuable resources will be made available again soon. In the meantime, our summaries, which glean key insights from those resources, may help fill the gap.

A heartfelt thank you to our network for your support and the invaluable work you do.

The Psychological Drivers of Misinformation Belief and Its Resistance to Correction

Affiliation

University of Western Australia (Ecker); University of Bristol (Lewandowsky); Monash University (Cook); University of Erfurt (Schmid); Vanderbilt University (Fazio); Harvard University (Brashier); Purdue University (Brashier); University of Minnesota (Kendeou, Vraga); Boston University (Amazeen)

Summary

"The contemporary information landscape brings particular challenges: the internet and social media have enabled an exponential increase in misinformation spread and targeting to precise audiences..."

For decades, science communication has operated under the assumption that a thorough and accessible explanation of facts should overcome the impact of misinformation. However, some individuals deny the existence of climate change or reject vaccinations despite being aware of a scientific consensus to the contrary. Thus, as this article contends, understanding and countering misinformation requires considering the cognitive architecture and context of individual decision makers. It describes the cognitive, social, and affective factors that lead people to form or endorse misinformed views and the psychological barriers to knowledge revision after misinformation has been corrected. It provides guidance on countering misinformation, including educational and preemptive interventions, refutations, and psychologically informed technological solutions. Finally, it looks at broader societal trends that have contributed to the rise of misinformation and discusses practical implications.

The paper opens by examining drivers of false beliefs, which generally arise through the same mechanisms that establish accurate beliefs. These drivers include: cognitive factors, such as the use of intuitive thinking and failures of memory; social factors, such as reliance on source cues to determine truth; and affective factors, such as the influence of mood on credulity and the use of emotional language (for example, by anti-vaccination activists). Emotion can be persuasive because it distracts readers from potentially more diagnostic cues, such as source credibility.

As outlined here, the information deficit model holds that false beliefs can easily be corrected by providing relevant facts. However, misinformation often continues to influence people's thinking even after they receive a correction and accept it as true. This so-called continued influence effect (CIE) may help explain the persistence of beliefs such as the belief in a link between vaccines and autism, despite strong evidence discrediting that link. Theoretical accounts of the CIE draw heavily on models of memory: when information is encoded into memory and new information that discredits it is later learned, the original information is not simply erased or replaced. For example, misinformation that a vaccine has caused an unexpectedly large number of deaths might be integrated with existing knowledge related to diseases, vaccinations, and causes of death. A subsequent correction stating that the claim of vaccine-caused deaths was inaccurate will also be added to memory and is likely to result in some knowledge revision. However, the misinformation will remain in memory and can potentially be reactivated and retrieved later on.

Social and affective mechanisms also influence the CIE. One socio-affective factor is source credibility: the perceived trustworthiness and expertise of the sources providing the misinformation and the correction. Another is the set of values and beliefs that ground a person's personal and sociocultural identity. Corrections that attack a person's worldview can be ineffective or even backfire. For example, if a message is appraised as an identity threat (say, a person who identifies as an anti-vaxxer encountering a correction stating that the risks of a vaccine do not outweigh the risks of the disease), intense negative emotions can arise that motivate strategies such as discrediting the source of the correction, ignoring the worldview-inconsistent evidence, or selectively focusing on worldview-bolstering evidence. However, there is a developing consensus that even worldview-inconsistent corrections typically have some beneficial impact.

There is evidence that corrections can benefit from emotional recalibration. When misinformation downplays a risk or threat - for example, misinformation that a serious disease is relatively harmless - corrections that provide a more accurate risk evaluation operate partly through their impact on emotions such as hope, anger, and fear. This emotional mechanism might help correction recipients realign their understanding of the situation with reality - for example, to realise they have underestimated the real threat. Likewise, countering disinformation that seeks to fuel fear or anger can benefit from a downward adjustment of emotional arousal; for instance, refutations of vaccine misinformation can reduce anti-vaccination attitudes by mitigating misinformation-induced anger.

More generally, two strategies for combatting misinformation can be distinguished: preemptive intervention (prebunking) and reactive intervention (debunking). The effectiveness of these corrections is influenced by a range of factors, and there are mixed results regarding their relative efficacy. In the case of anti-vaccination conspiracy theories, prebunking has been found to be more effective than debunking. Many prebunking interventions draw on inoculation theory, which applies the principle of vaccination to knowledge, positing that "inoculating" people with a weakened form of a persuasive argument can build immunity against subsequent persuasion by engaging their critical-thinking skills. Inoculation theory has also been used to explain how strategies designed to increase information literacy and media literacy could reduce the effects of misinformation.

However, concern often centres on specific pieces of misinformation, which call for more targeted responses than prebunking can offer. That is where debunking comes into play: reactive interventions that retrospectively target concrete instances of misinformation. For example, if a falsehood that a vaccine can lead to life-threatening side effects in pregnant women begins to spread, this misinformation must be addressed with specific counter-evidence. Numerous best practices for debunking have emerged; six of them are reviewed here. To cite one: corrections should be paired with relevant social norms, including injunctive norms ("protecting the vulnerable by getting vaccinated is the right thing to do") and descriptive norms ("over 90% of parents are vaccinating their children"), as well as expert consensus ("doctors and medical societies around the world agree that vaccinations are important and safe"). However, research has shown that even well-designed debunking interventions might not have long-lasting effects.

After considering misinformation corrections in social media contexts, the paper looks at implications for practitioners in various fields - journalists, legislators, public health officials, and healthcare workers - as well as information consumers. In brief:

  • Practitioners: Different strategies for countering misinformation are available to practitioners at different time points, as illustrated in Figure 6 of the paper. Suggestions are provided. For example, those engaged in debunking should provide a plausible alternative cause for an event or factual details, preface the misinformation with a warning, explain any logical fallacies or persuasive techniques used to promote the misinformation, and end with a factual statement.
  • Information consumers: Thoughtless sharing can amplify misinformation that might confuse and deceive others. Thus, while engaged with content, individuals should slow down, think about why they are engaging, and carefully and thoughtfully question their visceral response.
  • Policymakers: It is important to scrutinise whether the practices and algorithms of media platforms are optimised to promote misinformation or truth. Policymakers should consider enhanced regulation - while avoiding censorship. ("However, freedom of speech does not include the right to amplification of that speech.") Other tools include: supporting a diverse media landscape and adequately funding independent public broadcasters; investing in education, particularly to build information literacy skills in schools; and developing interventions targeted more directly at behaviour, such as nudging policies and public pledges to honour the truth.

"Overall, solutions to misinformation spread must be multipronged and target both the supply (for example, more efficient fact-checking and changes to platform algorithms and policies) and the consumption (for example, accuracy nudges and enhanced media literacy) of misinformation."

The paper concludes by suggesting directions for future research that might help us fully understand the psychology of misinformation. For example: "Future empirical and theoretical work would benefit from development of an overarching theoretical model that aims to integrate cognitive, social and affective factors, for example by utilizing agent-based modelling approaches. This approach might also offer opportunities for more interdisciplinary work...at the intersection of psychology, political science...and social network analysis..., and the development of a more sophisticated psychology of misinformation."
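
To make the quoted suggestion concrete, the following short Python simulation is a purely illustrative sketch - not a model from the paper - of what an agent-based approach integrating cognitive, social, and affective factors might look like. Agents spread a piece of misinformation through social contact; a later correction campaign produces partial knowledge revision, weakened by each agent's identity alignment; and a residual belief floor stands in for the continued influence effect. All names, parameters, and values are hypothetical.

    # Illustrative sketch only - not the authors' model. All parameters
    # (SPREAD_RATE, CORRECTION_EFFECT, CIE_RESIDUAL, etc.) are hypothetical.
    import random

    random.seed(42)

    N_AGENTS = 200           # population size (hypothetical)
    N_STEPS = 50             # simulation length
    SPREAD_RATE = 0.3        # chance a believer shares with a random contact
    CORRECTION_STEP = 25     # step at which a correction campaign begins
    CORRECTION_EFFECT = 0.6  # maximum fraction of belief a correction removes
    CIE_RESIDUAL = 0.2       # continued influence: exposed agents' belief
                             # never falls below this floor

    class Agent:
        def __init__(self):
            self.belief = 0.0     # 0 = rejects the misinformation, 1 = accepts
            self.exposed = False  # has encountered the misinformation
            # Worldview attachment; higher values resist correction more.
            self.identity_alignment = random.random()

    agents = [Agent() for _ in range(N_AGENTS)]
    for seed_agent in random.sample(agents, 5):  # a few initial spreaders
        seed_agent.belief, seed_agent.exposed = 1.0, True

    for step in range(N_STEPS):
        # Social transmission: each believer shares with one random contact.
        for agent in agents:
            if agent.belief > 0.5 and random.random() < SPREAD_RATE:
                contact = random.choice(agents)
                contact.belief = max(contact.belief, agent.belief * 0.9)
                contact.exposed = True
        # Correction campaign: partial revision, moderated by identity.
        if step >= CORRECTION_STEP:
            for agent in agents:
                if agent.exposed:
                    revision = CORRECTION_EFFECT * (1 - agent.identity_alignment)
                    agent.belief = max(agent.belief * (1 - revision),
                                       CIE_RESIDUAL)
        if step % 10 == 0:
            mean_belief = sum(a.belief for a in agents) / N_AGENTS
            print(f"step {step:2d}: mean belief = {mean_belief:.2f}")

A fuller model in this spirit would add an explicit social network, source-credibility weights, and affective states - precisely the kind of integration of psychology and social network analysis that the authors suggest agent-based approaches could enable.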

Source

Nature Reviews Psychology, volume 1, pages 13-29 (2022). https://doi.org/10.1038/s44159-021-00006-y