Best Practices for Debunking Misinformation: The Devil is in the Details

True or false?

[Box 1: a set of false statements about COVID-19]

Here are the facts:

[Box 2: the corresponding corrective facts]

The facts in the second box are supported by evidence from reputable sources: Scientific American, Johns Hopkins University, and the Mayo Clinic, respectively.1,2,3

However, if you believe any of the false statements in the first box, it's unlikely that merely reading the facts will change your mind. Multiple laboratory studies on debunking, the process of correcting misinformation after it has generated false beliefs, demonstrate that it is extremely difficult to dislodge misinformation from the minds of those who have been exposed to it.4 In fact, corrections that merely state the opposite of a false statement can even backfire, causing readers to double down on their erroneous beliefs.5 Other research suggests that providing detailed counterarguments is a more effective strategy because it allows readers to build a new explanatory framework for their beliefs.6,7,8

Recognizing the complexities involved in correcting misinformation, researchers Man-pui Sally Chan, Christopher Jones, Kathleen Hall Jamieson, and Dolores Albarracin conducted a meta-analysis to determine the most effective strategies for debunking online falsehoods. Previous literature on debunking suggested that before individuals can discard prior beliefs, they must first build a new mental model,4 or explanatory framework, to explain the phenomenon in question. Chan and her team hypothesized that more detailed debunking messages would allow readers to build more coherent mental models, enabling them to discard the misinformation. After searching multiple databases, Chan and colleagues identified 52 experimental studies, representing a sample of 6,878 participants, that measured participants' beliefs in misinformation and debunking messages. The studies were drawn from cognitive psychology, social psychology, and political science. A possible limitation of this sample is that the studies did not account for participants' dispositional (e.g., political or religious) attitudes, which can make people resistant to revising prior beliefs.

Experimental Paradigm

The meta-analysis used a between-subjects design to investigate the effect sizes of the interventional treatments. Many studies in the meta-analysis used an experimental paradigm identical, or similar, to the Warehouse Fire Paradigm.9,10,11 Here's how it works (a minimal simulation sketch follows the steps below):

  • Participants are randomly assigned to experimental and control groups, and an intervention follows. Participants in the experimental group read a report containing false information about the causes of a warehouse fire.
  • The experimental group is divided into two sub-groups: half of the experimental participants read a debunking message (debunking group); the remaining half read no additional information (misinformation group).
  • Participants in the control group read some information about the fire but read neither the false information nor a debunking message. Following the intervention, all participants answered survey questions about the fire.
[Figure: Debunking experiments often use an experimental design similar to the Warehouse Fire Paradigm, illustrated above. Following an intervention and a delay, researchers measure participants' references to misinformation messages and debunking messages.]
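To make the design concrete, here is a minimal simulation sketch of the paradigm in Python. The three groups follow the description above, but the group sizes and the simulated counts of misinformation references are invented for illustration; they are not data from any study in the meta-analysis.

```python
import random

random.seed(42)

# Hypothetical sketch of the Warehouse Fire Paradigm. Each participant
# is assigned to one of three groups and, after a delay, answers survey
# questions; the outcome is the number of references they make to the
# false cause of the fire. All numbers below are illustrative.

def simulate_references(group: str) -> int:
    """Return a simulated count of references to the misinformation."""
    if group == "control":          # never read the false cause
        return random.choice([0, 0, 0, 1])
    if group == "misinformation":   # read the false cause, no correction
        return random.choice([2, 3, 3, 4])
    if group == "debunking":        # read the false cause plus a correction
        return random.choice([1, 1, 2, 2])
    raise ValueError(f"unknown group: {group}")

groups = ["control", "misinformation", "debunking"]
scores = {g: [simulate_references(g) for _ in range(40)] for g in groups}

for g in groups:
    print(f"{g:>14}: mean references = {sum(scores[g]) / len(scores[g]):.2f}")
```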

Researchers then tallied participants' factual and inferential references to the false information and the debunking message. The mean number of references, by group, was used to calculate the three effect sizes in the meta-analysis:

  • the misinformation effect: the difference between the mean scores of the misinformation group and the control group;
  • the debunking effect: the difference between the mean scores of the misinformation group and the debunking group;
  • the misinformation-persistence effect: the difference between the mean scores of the debunking group and the control group.

All three effect sizes were large, meaning that each effect represented a meaningful difference in participants' performance on the survey questions.
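Continuing the sketch above, the three contrasts can be computed directly from group scores. The example below uses invented reference counts and reports each contrast as a standardized mean difference (Cohen's d), a common way for meta-analyses to express effect sizes; the exact computations in the underlying studies may differ.

```python
from statistics import mean, stdev

# Invented reference counts for the three groups; only the direction of
# the differences (misinformation > debunking > control) is meaningful.
control = [0, 0, 1, 0, 1, 0]
misinformation = [3, 2, 4, 3, 3, 2]
debunking = [1, 2, 1, 2, 1, 1]

def cohens_d(a: list, b: list) -> float:
    """Standardized mean difference using a pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2) / (na + nb - 2)
    return (mean(a) - mean(b)) / pooled_var ** 0.5

# Misinformation effect: misinformation group vs. control group.
print("misinformation effect:", round(cohens_d(misinformation, control), 2))
# Debunking effect: misinformation group vs. debunking group.
print("debunking effect:", round(cohens_d(misinformation, debunking), 2))
# Misinformation-persistence effect: debunking group vs. control group.
print("misinformation-persistence effect:", round(cohens_d(debunking, control), 2))
```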

Moderators: Which study characteristics affected the outcomes?

The next step in the meta-analysis was to identify potential moderator variables, or characteristics across studies that may have influenced the misinformation, debunking, and misinformation-persistence effects. The team identified three potential moderators:

  1. whether explicit instructions or the experimental setting encouraged participants, even spontaneously, to think of (or elaborate upon) explanations for the fire that were consistent with the misinformation story;
  2. whether explicit instructions or the experimental setting encouraged participants to generate counterarguments to the misinformation; and
  3. the level of detail of the debunking message.

The moderators had the following effects (a sketch of the corresponding subgroup comparison follows the list):

  • When participants were encouraged to elaborate on the misinformation message, the debunking effect was weaker and the misinformation was more persistent.
  • When participants were encouraged to elaborate on counterarguments to the misinformation message, the debunking effect was stronger.
  • More detailed debunking messages were associated with a stronger debunking effect.
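
For illustration, here is a minimal sketch of a moderator (subgroup) analysis, assuming each study contributes a debunking effect size d with variance v. Studies are split by a binary moderator (whether participants generated counterarguments) and pooled within each level using fixed-effect, inverse-variance weights; all of the numbers are invented.

```python
# Each dict stands in for one study; d, v, and the moderator flag are
# invented values, not results from the Chan et al. meta-analysis.
studies = [
    {"d": 0.9, "v": 0.04, "counterargue": True},
    {"d": 1.1, "v": 0.05, "counterargue": True},
    {"d": 0.4, "v": 0.03, "counterargue": False},
    {"d": 0.5, "v": 0.06, "counterargue": False},
]

def pooled_effect(subset: list) -> float:
    """Fixed-effect (inverse-variance weighted) mean effect size."""
    weights = [1 / s["v"] for s in subset]
    effects = [s["d"] for s in subset]
    return sum(w * d for w, d in zip(weights, effects)) / sum(weights)

for level in (True, False):
    subset = [s for s in studies if s["counterargue"] == level]
    print(f"counterarguing={level}: pooled debunking d = {pooled_effect(subset):.2f}")
```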

Recommendations

How can these results be applied outside of the laboratory? Chan's team recommended that media and policymakers follow these practices:

  • Avoid repeating or elaborating on the misinformation in detail, since elaboration makes false beliefs more persistent.
  • Encourage audiences to scrutinize the misinformation and generate their own counterarguments.
  • Pair any correction with a detailed debunking message that supplies a new explanation for the events in question.

The authors also encouraged policymakers to continue developing online alerting platforms, such as Snopes.com, which identify and correct misinformation.


Additional References:

  1. https://www.scientificamerican.com/article/covid-19-is-now-the-third-leading-cause-of-death-in-the-u-s1/
  2. https://www.hopkinsmedicine.org/health/conditions-and-diseases/coronavirus/covid-19-vaccines-myth-versus-fact
  3. https://www.mayoclinichealthsystem.org/hometown-health/featured-topic/covid-19-vaccine-myths-debunked
  4. Lewandowsky, S., Ecker, U. K., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106–131.
  5. Schwarz, N., Sanna, L. J., Skurnik, I., & Yoon, C. (2007). Metacognitive experiences and the intricacies of setting people straight: Implications for debiasing and public information campaigns. In M. P. Zanna (Ed.), Advances in Experimental Social Psychology (Vol. 39, pp. 127–191). San Diego, CA: Academic Press.
  6. Jerit, J. (2008). Issue framing and engagement: Rhetorical strategy in public policy debates. Political Behavior, 30, 1–24.
  7. Johnson, H. M., & Seifert, C. M. (1994). Sources of the continued influence effect: When misinformation in memory affects later inferences. Journal of Experimental Psychology: Learning, Memory, and Cognition, 20, 1420–1436.
  8. Wilkes, A. L., & Leatherbarrow, M. (1988). Editing episodic memory following the identification of error. Quarterly Journal of Experimental Psychology: Human Experimental Psychology, 40, 361–387.
  9. Ecker, U. K. H., Lewandowsky, S., Swire, B., & Chang, D. (2011). Correcting false information in memory: Manipulating the strength of misinformation encoding and its retraction. Psychonomic Bulletin & Review, 18, 570–578.
  10. Johnson, H. M., & Seifert, C. M. (1994). Sources of the continued influence effect: When misinformation in memory affects later inferences. Journal of Experimental Psychology: Learning, Memory, and Cognition, 20, 1420–1436.

  11. Wilkes, A. L., & Leatherbarrow, M. (1988). Editing episodic memory following the identification of error. Quarterly Journal of Experimental Psychology: Human Experimental Psychology, 40, 361–387.