Correction to: Nature Human Behaviour https://doi.org/10.1038/s41562-023-01623-8, published online 15 June 2023.
In the version of the article originally published, the reported numbers of included studies and effect sizes were incorrect. For some of the synthesized reports, we also identified errors in our effect-size estimation, coding inconsistencies and imprecise effect-size estimates.
With respect to the study and effect size numbers, we had incorrectly used the same study number (#38) for two different reports: Vraga, Kim & Cook, 2019 (ref. 1) and van Stekelenburg et al., 2021, Study 1 (ref. 2). As a result, the reported number of 74 studies was incorrect; the correct number is 75. In the total sample size across effect sizes (N = 60,861), we had originally included 61 records with no effect sizes, which were not part of the final analyses. The correct total is N = 53,320.
With respect to our handling of the data, we had incorrectly estimated effect sizes for five articles. For Ecker & Ang, 2019 (ref. 3), the standard error was used in place of the standard deviation. For Jacobson, 2022 (ref. 4), the wrong mean values were used. For Golding et al., 1990 (ref. 5), a wrong equation was used to estimate pooled standard deviations. For Vraga, Kim & Cook, 2019 (ref. 1), we did not convert the standard error to the standard deviation correctly. Finally, for Swire-Thompson et al., 2022 (ref. 6), we had incorrectly typed the multiplication operator in the equation as exponentiation. These errors have now been corrected.
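For clarity, the standard relations involved in these corrections take the following textbook forms (shown here as generic formulas, not as a transcript of our coding scripts): the standard deviation recovered from a reported standard error for a group of size n, and the pooled standard deviation for two independent groups of sizes n1 and n2:

```latex
% Standard conversions assumed here; not a transcript of the coding scripts.
\[
s = \mathrm{SE}\,\sqrt{n}, \qquad
s_p = \sqrt{\frac{(n_1 - 1)\,s_1^2 + (n_2 - 1)\,s_2^2}{n_1 + n_2 - 2}}
\]
```

Using the standard error where the pooled standard deviation belongs shrinks the denominator by roughly a factor of the square root of n, which is why errors of this kind can inflate a standardized mean difference substantially.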
Other coding decisions were incorrect as well. Specifically, for Ecker & Rodricks, 2020 (ref. 7); Vijaykumar et al., 2021 (ref. 8); Calabrese & Albarracin, 2021 (ref. 9); Carey et al., 2022 (ref. 10); Lanius et al., 2021 (ref. 11); and Shi et al., 2020 (ref. 12), we reported a single effect size instead of separate effect sizes by gender, age and geographical region, which would have been the correct coding procedure. Correcting these errors resulted in 9 additional records and 10 additional effect sizes. Moreover, we did not code all available correction and misinformation-persistence effect sizes from each report for the following articles: Anderson, Lepper & Ross, 1980 (ref. 13); Anderson, New & Speer, 1985 (ref. 14); Davies, 1997 (ref. 15); Guenther & Alicke, 2008 (ref. 16); Wright et al., 1996 (ref. 17); Wyer & Budesheim, 1987 (ref. 18); Wyer & Unverzagt, 1985 (ref. 19); De keersmaecker & Roets, 2017 (ref. 20); Ferrero et al., 2020 (ref. 21); van Stekelenburg et al., 2021 (ref. 2); Bode, Vraga & Tully, 2021 (ref. 22); Smith & Seitz, 2019 (ref. 23); Winters et al., 2021 (ref. 24); Kuru et al., 2021 (ref. 25); Ecker, 2022 (ref. 26); Yousuf et al., 2021 (ref. 27); Maertens, Anseel & van der Linden, 2020 (ref. 28); Pluviano, Watt & Della Sala, 2017 (ref. 29); and Pluviano et al., 2022 (ref. 30). Correctly adding these effect sizes resulted in 25 additional records and 30 additional effect sizes. Finally, we had imprecisely estimated the effects for Andrews, 2021 (ref. 31); Vraga et al., 2020 (ref. 32); and Vraga & Bode, 2018 (ref. 33) because we did not check all possible sources, including tables, figures, appendices and shared data repositories, to extract the most precise statistics for effect-size estimation. These errors were also corrected, resulting in changes to 8 effect sizes with no additional records or effect sizes (original version of the article: ds from −0.17 to 4.11; newly calculated statistics: ds from −0.12 to 0.75).
Some moderators in those papers were also incorrect, which led to 68 changes in the moderator codes. The changes concerned the codes for negative misinformation, detailed corrections, misinformation domain, fictitious issue and the method of effect-size estimation. For negative misinformation, we corrected Code 1 to 2 for Calabrese & Albarracin, 2021, Studies 2–7 (ref. 9); De keersmaecker & Roets, 2017 (ref. 20); van Stekelenburg et al., 2021, Study 2 (ref. 2); and Yousuf et al., 2021 (ref. 27); and Code 2 to 1 for Vijaykumar et al., 2021 (ref. 8). For detailed corrections, we corrected Code 1 to 2 for Calabrese & Albarracin, 2021, Study 2 (ref. 9) and Shi et al., 2020 (ref. 12). For domain of misinformation, we corrected Code −1 to 1 for Ecker & Rodricks, 2020 (ref. 7); Wyer & Budesheim, 1987 (ref. 18); Wyer & Unverzagt, 1985 (ref. 19); and De keersmaecker & Roets, 2017 (ref. 20); Code 2 to 3 for Calabrese & Albarracin, 2021, Studies 1–7 (ref. 9); Bode, Vraga & Tully, 2021 (ref. 22); Lanius, Weber & MacKenzie, 2021 (ref. 11); Vijaykumar et al., 2021 (ref. 8); and Andrews, 2021 (ref. 31); Code 2 to −1 for Carey et al., 2022 (ref. 10) and Winters et al., 2021 (ref. 24); and Code 3 to −1 for van Stekelenburg et al., 2021, Study 1 (ref. 2); Maertens, Anseel & van der Linden, 2020 (ref. 28); and Vraga et al., 2020 (ref. 32). For fictitious issue, we corrected Code 1 to −1 for Shi et al., 2020 (ref. 12) and Maertens, Anseel & van der Linden, 2020 (ref. 28); and Code −1 to 1 for Pluviano et al., 2022, Study 1 (ref. 30). For the method of effect-size estimation, we corrected Code 2 to 1 for Anderson, New & Speer, 1985 (ref. 14); Davies, 1997 (ref. 15); and Wright et al., 1996 (ref. 17).
The re-analyses of the corrected dataset, including the procedures to detect influential cases and all bias analyses, necessitated corrections to the statistical information in the text and in Tables 1–4. All changes in the tables were due to the aforementioned changes in the number of effect sizes and in the effect sizes themselves in the corrected dataset. Figures 2–4 have also been corrected to reflect these changes.
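To illustrate the kind of re-analysis involved, the sketch below pools standardized mean differences under a random-effects model with the DerSimonian–Laird estimator. This is a minimal illustration with hypothetical inputs, not the published analysis pipeline, which additionally included influence diagnostics, moderator models and bias analyses:

```python
import numpy as np

def dersimonian_laird(d, v):
    """Pooled effect, its SE and tau^2 via the DerSimonian-Laird estimator."""
    d, v = np.asarray(d, float), np.asarray(v, float)
    w = 1.0 / v                                   # fixed-effect weights
    d_fixed = np.sum(w * d) / np.sum(w)           # fixed-effect pooled d
    q = np.sum(w * (d - d_fixed) ** 2)            # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(d) - 1)) / c)       # between-study variance
    w_re = 1.0 / (v + tau2)                       # random-effects weights
    d_re = np.sum(w_re * d) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))
    return d_re, se_re, tau2

# Hypothetical effect sizes (Cohen's d) and sampling variances:
d_re, se, tau2 = dersimonian_laird([0.30, -0.10, 0.50], [0.04, 0.02, 0.09])
print(f"pooled d = {d_re:.2f}, 95% CI {d_re - 1.96*se:.2f} to {d_re + 1.96*se:.2f}")
```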
The original findings reported in the published article remain largely unaffected, with the following exceptions: the effect of detailed corrections is no longer statistically significant (original version of the article: P = 0.020; newly calculated statistics: P = 0.316; pages 1516 and 1517), and there is no longer evidence for an effect of fictitious issues (which the original version of the article had reported as “marginal” on page 1516).
On page 1522, the number of outlying debunking effects changed from four to six, and the outliers’ cut-offs changed from d < −7.00 or d > 4.67 to d < −2.37 or d > 2.61, owing to the aforementioned changes in the number of effect sizes and in the effect sizes themselves in the corrected dataset (the original version of the article reported 205 effect sizes from 74 research reports, with a debunking effect of d = 0.19, P = 0.131, 95% CI −0.06 to 0.43; the corrected dataset includes 245 effect sizes from 75 research reports, with a debunking effect of d = 0.11, P = 0.142, 95% CI −0.04 to 0.26).
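As a rough illustration of how outlying effects can be screened, the sketch below flags any effect size lying more than k standard deviations from the mean of the remaining effects, a leave-one-out rule in the spirit of studentized residuals. This is an assumed simplification for illustration only, not the exact influence procedure used in the article, and the data are hypothetical:

```python
import numpy as np

def flag_influential(d, k=3.0):
    """Flag effects more than k SDs from the mean of the other effects.

    A leave-one-out screen; not the article's exact influence diagnostics.
    """
    d = np.asarray(d, float)
    flags = np.zeros(len(d), dtype=bool)
    for i in range(len(d)):
        rest = np.delete(d, i)                      # all effects except d[i]
        z = (d[i] - rest.mean()) / rest.std(ddof=1)
        flags[i] = abs(z) > k
    return flags

d = np.array([0.11, 0.42, -0.05, 0.75, -0.12, 4.11])  # hypothetical ds
print(flag_influential(d))  # only the extreme effect (4.11) is flagged
```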
We also rectified other errors. We corrected an erroneous example of the misinformation domains on page 1521 (the original version of the article indicated “climate change” but it should be “genetically modified foods”). Also, on page 1520, we had stated: “As only two reports used a mixed-subjects design and provided sufficient data (that is, means and s.d. of the control groups at timepoints 1 and 2), the raters followed a within-subject effect-size equation to calculate the effect sizes.” This statement was wrong because the appropriate between- and within-subjects equations had both been used; the sentence has therefore been deleted. On page 1522, we corrected a wrong reference for the three-parameter selection method (ref. 34): the original version of the article cited its reference 124 (ref. 35 in this correction), but it should be its reference 123 (ref. 34 in this correction). Finally, on page 1517, we had incorrectly reported the debunking effect where the correction effect was stated. We now report the correction effect and have changed the parenthetical comment to “(d = 0.19, P < 0.131, 95% CI −0.06 to 0.43)”.
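For reference, the between- and within-subjects effect-size equations mentioned above have the following standard textbook forms (shown under the usual standardized-mean-difference definitions; the article’s exact implementation may differ in detail), where s_p is the pooled standard deviation and s_D is the standard deviation of the change scores for two correlated timepoints with correlation r:

```latex
% Generic standardized-mean-difference forms; assumed, not transcribed.
\[
d_{\text{between}} = \frac{\bar{X}_1 - \bar{X}_2}{s_p}, \qquad
d_{\text{within}} = \frac{\bar{X}_{t2} - \bar{X}_{t1}}{s_D},
\quad s_D = \sqrt{s_{t1}^2 + s_{t2}^2 - 2\,r\,s_{t1} s_{t2}}
\]
```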
References
1. Vraga, E. K., Kim, S. C. & Cook, J. Testing logic-based and humor-based corrections for science, health, and political misinformation on social media. J. Broadcast. Electron. Media 63, 393–414 (2019).
2. van Stekelenburg, A., Schaap, G., Veling, H. & Buijzen, M. Boosting understanding and identification of scientific consensus can help to correct false beliefs. Psychol. Sci. 32, 1549–1565 (2021).
3. Ecker, U. K. H. & Ang, L. C. Political attitudes and the processing of misinformation corrections. Polit. Psychol. 40, 241–260 (2019).
4. Jacobson, N. G. What does climate change look like to you? The role of internal and external representations in facilitating conceptual change about the weather and climate distinction. Doctoral dissertation, University of Southern California (2022).
5. Golding, J. M., Fowler, S. B., Long, D. L. & Latta, H. Instructions to disregard potentially useful information: the effects of pragmatics on evaluative judgments and recall. J. Mem. Lang. 29, 212–227 (1990).
6. Swire-Thompson, B., Miklaucic, N., Wihbey, J. P., Lazer, D. & DeGutis, J. The backfire effect after correcting misinformation is strongly associated with reliability. J. Exp. Psychol. Gen. 151, 1655–1665 (2022).
7. Ecker, U. K. H. & Rodricks, A. E. Do false allegations persist? Retracted misinformation does not continue to influence explicit person impressions. J. Appl. Res. Mem. Cogn. 9, 587–601 (2020).
8. Vijaykumar, S. et al. How shades of truth and age affect responses to COVID-19 (mis)information: randomized survey experiment among WhatsApp users in UK and Brazil. Humanit. Soc. Sci. Commun. 8, 1–12 (2021).
9. Calabrese, C. & Albarracin, D. Unpublished data.
10. Carey, J. M. et al. The ephemeral effects of fact-checks on COVID-19 misperceptions in the United States, Great Britain and Canada. Nat. Hum. Behav. 6, 236–243 (2022).
11. Lanius, C., Weber, R. & MacKenzie, W. I. Use of bot and content flags to limit the spread of misinformation among social networks: a behavior and attitude survey. Soc. Netw. Anal. Min. 11, 1–15 (2021).
12. Shi, R., Feldman, R., Liu, J. & Clark, P. I. The dilemma of correcting nicotine misperceptions: nicotine replacement therapy versus electronic cigarettes. Health Commun. 36, 1856–1866 (2020).
13. Anderson, C. A., Lepper, M. R. & Ross, L. Perseverance of social theories: the role of explanation in the persistence of discredited information. J. Pers. Soc. Psychol. 39, 1037–1049 (1980).
14. Anderson, C. A., New, B. L. & Speer, J. R. Argument availability as a mediator of social theory perseverance. Soc. Cogn. 3, 235–249 (1985).
15. Davies, M. F. Belief persistence after evidential discrediting: the impact of generated versus provided explanations on the likelihood of discredited outcomes. J. Exp. Soc. Psychol. 33, 561–578 (1997).
16. Guenther, C. L. & Alicke, M. D. Self-enhancement and belief perseverance. J. Exp. Soc. Psychol. 44, 706–712 (2008).
17. Wright, E. F., Christie, S. D., Johnson, R. W. & Stoffer, E. S. The impact of group discussion on the theory-perseverance bias. J. Soc. Psychol. 136, 85–98 (1996).
18. Wyer, R. S. & Budesheim, T. L. Person memory and judgments: the impact of information that one is told to disregard. J. Pers. Soc. Psychol. 53, 14–29 (1987).
19. Wyer, R. S. & Unverzagt, W. H. Effects of instructions to disregard information on its subsequent recall and use in making judgments. J. Pers. Soc. Psychol. 48, 533–549 (1985).
20. De keersmaecker, J. & Roets, A. ‘Fake news’: incorrect, but hard to correct. The role of cognitive ability on the impact of false information on social impressions. Intelligence 65, 107–110 (2017).
21. Ferrero, M., Hardwicke, T. E., Konstantinidis, E. & Vadillo, M. A. The effectiveness of refutation texts to correct misconceptions among educators. J. Exp. Psychol. Appl. 26, 411–421 (2020).
22. Bode, L., Vraga, E. K. & Tully, M. Correcting misperceptions about genetically modified food on social media: examining the impact of experts, social media heuristics, and the gateway belief model. Sci. Commun. 43, 225–251 (2021).
23. Smith, C. N. & Seitz, H. H. Correcting misinformation about neuroscience via social media. Sci. Commun. 41, 790–819 (2019).
24. Winters, M. et al. Debunking highly prevalent health misinformation using audio dramas delivered by WhatsApp: evidence from a randomised controlled trial in Sierra Leone. BMJ Glob. Health 6, e006954 (2021).
25. Kuru, O. et al. The effects of scientific messages and narratives about vaccination. PLoS One 16, e0248328 (2021).
26. Ecker, U. K. H. Correcting vaccine misinformation. OSF https://osf.io/dwyma/ (2022).
27. Yousuf, H. et al. A media intervention applying debunking versus non-debunking content to combat vaccine misinformation in elderly in the Netherlands: a digital randomised trial. EClinicalMedicine 35, 100881 (2021).
28. Maertens, R., Anseel, F. & van der Linden, S. Combatting climate change misinformation: evidence for longevity of inoculation and consensus messaging effects. J. Environ. Psychol. 70, 101455 (2020).
29. Pluviano, S., Watt, C. & Della Sala, S. Misinformation lingers in memory: failure of three pro-vaccination strategies. PLoS One 12, e0181640 (2017).
30. Pluviano, S., Watt, C., Pompéia, S., Ekuni, R. & Della Sala, S. Forming and updating vaccination beliefs: does the continued effect of misinformation depend on what we think we know? Cogn. Process. 23, 367–378 (2022).
31. Andrews, E. A. Combating COVID-19 vaccine conspiracy theories: debunking misinformation about vaccines, Bill Gates, 5G, and microchips using enhanced correctives. ProQuest Dissertations and Theses (Ann Arbor, 2021).
32. Vraga, E. K., Kim, S. C., Cook, J. & Bode, L. Testing the effectiveness of correction placement and type on Instagram. Int. J. Press/Politics 25, 632–652 (2020).
33. Vraga, E. K. & Bode, L. I do not believe you: how providing a source corrects health misperceptions across social media platforms. Inf. Commun. Soc. 21, 1337–1353 (2018).
34. Pustejovsky, J. E. & Rodgers, M. A. Testing for funnel plot asymmetry of standardized mean differences. Res. Synth. Methods 10, 57–71 (2019).
35. Maier, M., Bartoš, F. & Wagenmakers, E.-J. Robust Bayesian meta-analysis: addressing publication bias with model-averaging. Psychol. Methods https://doi.org/10.1037/met0000405 (2022).