When communicating psychological intervention research, two pernicious tendencies have become prominent: using imprecise terms with lay meanings and sensationalizing outcome descriptions. This Comment examines the consequences of these communication styles and proposes strategies for effective communication, ensuring enthusiasm does not come at the cost of credibility.
Science communication does more than inform—it shapes careers, policies, and lives. The words researchers choose can determine whether an idea thrives or fades away. When an idea is branded as ‘cutting edge,’ graduate students may dedicate years to studying it. If outcomes are described as ‘high impact,’ funders may be more likely to support the work. And if researchers describe their work as ‘transformative,’ members of the public may be more likely to embrace those ideas when trying to improve their lives or help their children succeed.
Communication that generates excitement for a research idea increases media coverage, funding, and research attention. This increased attention then generates additional excitement, creating a self-reinforcing cycle. Yet this cycle has a hidden cost: to sustain interest, each new claim must be as sensational as, if not more sensational than, the last.
Here, we describe two common communication styles for psychological interventions: (1) the use of imprecise, colloquial terms, and (2) evocative, unrealistic descriptions of outcomes. We contend that the widespread scientific and public interest in psychological interventions is likely due in part to how they are communicated.
Common communication styles about psychological theories and interventions
Colloquial terms
Terms used to describe psychological concepts often carry lay connotations that differ from their scientific definitions. Terms such as “mindset” and “wise” invoke a sense of familiarity and an intuitive—but not necessarily accurate—understanding of their meaning. For example, within mindset theory, “mindset” usually refers to one’s belief about the changeability of a human attribute—someone with a “growth mindset” holds that traits are malleable, whereas someone with a “fixed mindset” holds that traits are relatively stable. In contrast, outside of mindset theory, “mindset” usually refers to a way of thinking or a set of attitudes. Before “growth” and “fixed” entered the collective vocabulary, mindset was frequently preceded by “positive”—i.e., to have a positive mindset. Critically, “mindset” is still used with its original meaning. When someone first encounters mindset as a research concept or an intervention, the idea may seem familiar because people already know the word “mindset.” Familiarity breeds fluency, and fluency signals credibility, inviting acceptance without scrutiny.
Likewise, the word “wise” has had positive associations for centuries. Most people understand the word as a description of someone who makes use of substantial experience and knowledge to make good decisions. But in psychological intervention research, a “wise” intervention is one that focuses on how people interpret their social situation or themselves1—this use of the term has little relation to the lay understanding of the word. As such, people may assume the colloquial connotation that the interventions make use of substantial experience and knowledge. Again, the intuitive, familiar term with positive connotations disinvites scrutiny. Stated differently, a “wise” intervention sounds like something everyone should do, whereas a “subjective construal” or even a “situation interpretation” intervention may invite additional questions.
Examples other than “mindset” and “wise” include “deliberate practice,” “mindfulness,” and “grit.” The use of these terms is likely to invoke a sense of credibility and acceptance over scrutiny and contemplation. This, in turn, increases the risk of misunderstandings.
Evocative descriptions of results
Beyond issues with the terms themselves, descriptions of psychological intervention outcomes are often evocative. For example, a recent book on “wise” interventions was titled “Ordinary Magic2.” In the book, the outcomes of psychological interventions are described as “extraordinary,” “transformative,” and—repeatedly—as “magic2.” Describing intervention outcomes as “magic” conjures a sense of profound, life-changing effects. This language carries the risk of creating unrealistic expectations and undermining trust, especially when effects are small, temporary, or fail to replicate.
Evocative descriptions from researchers are not limited to popular press books. For example, academic articles have claimed that teaching a growth mindset can advance peace in the Middle East3. Yet, Middle East conflicts stem from deeply entrenched, multifaceted, and interconnected factors, making it improbable (and arguably dismissive of the region’s complexities) that a growth mindset intervention could resolve them. As another example, brain training research led to a multi-billion-dollar industry4. Yet, companies such as Lumos Labs eventually faced charges for deceptive claims that their products could stave off cognitive deficits related to Alzheimer’s, traumatic brain injury, and post-traumatic stress5.
Finally, the news media pulls from popular press books, journal articles, and press releases, often further sensationalizing or promoting evocative descriptions of these interventions. Compounding this issue, many news outlets have shifted from employing trained science journalists to general reporters who lack the expertise to scrutinize scientific results6. These journalists face increasing pressure to produce news quickly and generate compelling content for clicks, which amplifies inaccurate and exaggerated scientific communication6.
Strategies for communication that foster enthusiasm without overstating effects
Table 1 provides examples of key questions that scholars, members of the media, and policymakers can ask when communicating research and/or interpreting scientific communication.
Strategies for scholars
Naming new psychological concepts is a critical task. Using established words that carry lay meanings with positive connotations (e.g., “wise” interventions) presents a risk of people feeling they already understand the concept and embracing the idea without question. Conversely, overly technical jargon (e.g., “subjective construal” interventions) can deter engagement. Researchers can aim for a middle ground, with terms that are descriptive yet invite inquiry (e.g., “situation interpretation” interventions).
Along with choosing terms carefully, providing a clear definition of the construct and intervention can further prevent misinterpretation. For example, what are the critical “active ingredients” in the intervention, according to theory? Accurate communication need not be boring: scholars can explain how the idea builds on or diverges from prior work and why it addresses a critical gap.
Likewise, descriptions of outcomes can be exciting without going beyond the data. Describing outcomes as “profound7,” “transformative8,” or “magic2” engages the public, but it implies that effects are universal and effortless, obscuring the nuanced reality of psychological change. Instead, scholars can express effects in real-world terms that lay people can understand. For example, a Cohen’s d of 0.5 indicates that, when assumptions are met, the average treated person fares better than approximately two-thirds of the control group. As another example, an effect size can be reported as the number of people in the sample who matched the theoretical expectation9—for example, reporting that 75% of the treated individuals saw greater improvements than participants in the control group. In these examples, readers may very well think of the effects as profound or transformative. In cases where the numbers suggest that the average treated person fares better than barely over 50% of the control group, or where effects are inconsistent, readers may not interpret the effects as profound. Providing real numbers that are easily understandable, rather than evocative descriptions, allows readers to form their own conclusions about the results.
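The translation from a standardized effect size to these real-world figures can be sketched in a few lines. The sketch below assumes normally distributed outcomes with equal variances in both groups; under those assumptions, the proportion of the control group scoring below the average treated person is Φ(d), a quantity known as Cohen’s U3.

```python
from math import erf, sqrt

def cohens_u3(d: float) -> float:
    """Proportion of the control group scoring below the average
    treated person: Phi(d), assuming normal distributions with
    equal variances in both groups."""
    return 0.5 * (1 + erf(d / sqrt(2)))

def probability_of_superiority(d: float) -> float:
    """Probability that a randomly chosen treated person outscores
    a randomly chosen control person: Phi(d / sqrt(2))."""
    return 0.5 * (1 + erf(d / 2))

# A "medium" effect (d = 0.5): the average treated person fares
# better than roughly two-thirds of the control group.
print(f"U3 for d = 0.5: {cohens_u3(0.5):.3f}")  # ~0.691
# A small effect (d = 0.1): barely over half of the control group.
print(f"U3 for d = 0.1: {cohens_u3(0.1):.3f}")  # ~0.540
```

Reporting either quantity alongside d lets readers judge for themselves whether an effect is “transformative” or merely a modest shift.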
The same recommendations hold when describing effects that only apply to a subsample. Subgroup effects can be meaningful, especially when they benefit underrepresented or at-risk populations. However, given the near-infinite number of ways that a sample can be divided, subgroup-related findings warrant cautious interpretation and replication. Communicating whether similar effects have previously been observed for the same group, and whether the effect for that group differs from other groups, can lend credence to claims about the importance of subgroup effects.
Finally, beyond reporting their own results, scholars should be mindful of how they engage with the media. Media that communicate results without overstating effects provide a service to the public. Media that use vague terms and evocative descriptions risk public disillusionment, where people may feel misled if their own outcomes are less than profound, transformative, or magical.
Strategies for media
Although individual reporters cannot eliminate the pressures of fast-paced sensationalism, journalists covering interventions can gather sufficient information to estimate, based on the data, how much a given person might realistically benefit. For example, journalists can ask authors if any real-world outcomes were measured and to describe the costs, benefits, and risks of the target intervention compared with another intervention or business as usual4. The same practices can apply to book editors working with authors seeking to translate their science to the general public. Though sensational claims may be more likely to sell, a publisher can develop a reputation for not overselling claims, increasing trust in their brand.
Strategies for policymakers
Policymakers, who are often charged with finding the most beneficial outcomes, can be susceptible to embracing ideas when benefits are described in evocative ways. Policymakers include funding entities, school district superintendents, and even parents deciding how to shape their children’s time. When deciding whether to implement an intervention, policymakers can try to determine if the benefits outweigh the known costs and risks. If costs and risks remain unknown, policymakers may wish to make the decision with extra care, or, in the case of funding entities, seek research examining costs and risks so that other policymakers can make well-informed decisions rather than relying on evocative descriptions.
Conclusion
The stakes are high: How science is communicated to the public can influence major decisions by individuals, parents, and policymakers. Common communication styles surrounding psychological interventions may not only create unrealistic expectations but also erode the public’s trust in psychological science. Researchers must balance accessibility with accuracy, journalists must probe beyond press releases, and policymakers must demand evidence over hype.
References
Walton, G. M. & Wilson, T. D. Wise interventions: psychological remedies for social and personal problems. Psychol. Rev. 125, 617 (2018).
Walton, G. M. Ordinary Magic: The Science of How We Can Achieve Big Change with Small Acts. (Random House, New York, 2025).
Dweck, C. S. Mindsets and human nature: promoting change in the Middle East, the schoolyard, the racial divide, and willpower. Am. Psychol. 67, 614–622 (2012).
Simons, D. J. et al. Do “brain training” programs work? Psychol. Sci. Public Interest 17, 103–186 (2016).
Federal Trade Commission. Lumosity to Pay $2 Million to Settle FTC Deceptive Advertising Charges for its “Brain Training” Program. Retrieved July 16, 2025 from https://www.ftc.gov/news-events/news/press-releases/2016/01/lumosity-pay-2-million-settle-ftc-deceptive-advertising-charges-its-brain-training-program (Federal Trade Commission, 2016).
Dempster, G., Sutherland, G. & Keogh, L. Scientific research in news media: a case study of misrepresentation, sensationalism and harmful recommendations. J. Sci. Commun. 21, A06 (2022).
Dweck, C. S. In Handbook of Theories of Social Psychology (eds Van Lange, P. A. M., Kruglanski, A. W. & Higgins, E. T.) (SAGE, London, 2012).
Boaler, J. et al. The transformative impact of a mathematical mindset experience taught at scale. Front. Educ. 6, 784393 (2021).
Grice, J. W. et al. Persons as effect sizes. Adv. Methods Pract. Psychol. Sci. 3, 443–455 (2020).
Author information
Contributions
B.N.M.: conceptualization, writing—original draft, writing—review and editing. A.P.B.: writing—review and editing. D.M.: conceptualization, writing—review and editing.
Ethics declarations
Competing interests
The authors declare no competing interests.
Peer review information
Primary handling editor: Marike Schiffer. A peer review file is available.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.
About this article
Cite this article
Macnamara, B.N., Burgoyne, A.P. & Moreau, D. Communicating science, not magic. Commun Psychol 3, 119 (2025). https://doi.org/10.1038/s44271-025-00301-x