
Perspective
How behavioural sciences can promote truth, autonomy and democratic discourse online

Abstract

Public opinion is shaped in significant part by online content, spread via social media and curated algorithmically. The current online ecosystem has been designed predominantly to capture user attention rather than to promote deliberate cognition and autonomous choice; information overload, finely tuned personalization and distorted social cues, in turn, pave the way for manipulation and the spread of false information. How can transparency and autonomy be promoted instead, thus fostering the positive potential of the web? Effective web governance informed by behavioural research is critically needed to empower individuals online. We identify technologically available yet largely untapped cues that can be harnessed to indicate the epistemic quality of online content, the factors underlying algorithmic decisions and the degree of consensus in online debates. We then map out two classes of behavioural interventions—nudging and boosting—that enlist these cues to redesign online environments for informed and autonomous choice.


Fig. 1: Challenges in automatically curated environments and on social media platforms.
Fig. 2: Nudging interventions that modify online environments.
Fig. 3: Illustrations of boosting interventions as they could appear within an online environment or as external tools.




Acknowledgements

We thank A. Kozyreva and S. Herzog for their helpful comments and D. Ain for editing the manuscript. R.H. and S.L. acknowledge support from the Volkswagen Foundation. The funders had no role in study design, data collection and analysis, decision to publish or preparation of the manuscript.

Author information


Contributions

P.L.S., S.L. and R.H. conceptualized the project; P.L.S., S.L., C.R.S. and R.H. wrote the manuscript.

Corresponding author

Correspondence to Philipp Lorenz-Spreen.

Ethics declarations

Competing interests

C.R.S. has served as a paid consultant on a few occasions for Facebook.

Additional information

Peer review information Primary handling editors: Mary Elizabeth Sutherland and Stavroula Kousta

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Lorenz-Spreen, P., Lewandowsky, S., Sunstein, C.R. et al. How behavioural sciences can promote truth, autonomy and democratic discourse online. Nat Hum Behav 4, 1102–1109 (2020). https://doi.org/10.1038/s41562-020-0889-7

