Gaze patterns during visual mental imagery reflect part-based generation

Enea J. Weber & Fred W. Mast

Scientific Reports (2026)

We are providing an unedited version of this manuscript to give early access to its findings. Before final publication, the manuscript will undergo further editing. Please note there may be errors present which affect the content, and all legal disclaimers apply.

Subjects

  • Human behaviour
  • Psychology

Abstract

Eye movements during visual mental imagery resemble those made during prior perception. Across two experiments, we investigated whether eye movements during imagery reflect a part-by-part generation of mental images by comparing gaze patterns during mental imagery to those during part-based viewing (using a gaze-contingent window, GCW) and to those during holistic viewing (using an artificial scotoma, AS). In Experiment 1, participants freely encoded and imagined pictures before reinspecting them either part by part (GCW condition) or holistically (AS condition). The results show that fixation scanpaths (MultiMatch) and refixation patterns (recurrence quantification analysis) during mental imagery largely mirrored those during GCW viewing. In Experiment 2, we examined whether this effect depends on prior perceptual encoding. Pictures were initially encoded either freely, with the AS, or with the GCW, and were subsequently imagined. The results show that, regardless of how the pictures were initially encoded, gaze patterns during mental imagery systematically resembled those during part-based perception. The current study provides direct evidence that eye movements during mental imagery reflect a part-by-part generation process of the imagined content, independent of prior perceptual encoding.
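
The abstract compares imagery and perception using MultiMatch scanpath similarity and recurrence quantification analysis (RQA) of refixations. As a rough illustration of the latter, the sketch below computes common eye-movement RQA measures from fixation coordinates. It is not the authors' analysis pipeline; the radius, minimum line length, and toy scanpath are illustrative assumptions.

```python
# Minimal, self-contained RQA sketch for fixation sequences (illustrative only).
import numpy as np

def fixation_rqa(fixations, radius=64.0, min_line=2):
    """Compute simple RQA measures for an (N, 2) array of fixation x/y positions."""
    fix = np.asarray(fixations, dtype=float)
    n = len(fix)
    # Two fixations count as "recurrent" (a refixation) if they fall
    # within `radius` pixels of each other.
    dists = np.linalg.norm(fix[:, None, :] - fix[None, :, :], axis=-1)
    rec = dists <= radius
    iu, ju = np.triu_indices(n, k=1)           # upper triangle, i < j
    upper = rec[iu, ju]
    c = upper.sum()                             # number of recurrent pairs
    recurrence = 100.0 * 2.0 * c / (n * (n - 1)) if n > 1 else 0.0

    # Determinism: share of recurrent points lying on diagonal lines of
    # length >= min_line, i.e. repeated sub-sequences of the scanpath.
    det_points = 0
    for offset in range(1, n):
        run = 0
        for v in list(np.diagonal(rec, offset=offset)) + [False]:  # sentinel closes the last run
            if v:
                run += 1
            else:
                if run >= min_line:
                    det_points += run
                run = 0
    determinism = 100.0 * det_points / c if c else 0.0

    # CORM: centre of recurrence mass; small values mean refixations occur
    # close together in time, large values mean they are far apart.
    corm = 100.0 * np.sum((ju - iu) * upper) / ((n - 1) * c) if c else 0.0
    return {"recurrence": recurrence, "determinism": determinism, "corm": corm}

# Toy example: a scanpath that revisits each of its first three locations.
scanpath = [(100, 100), (400, 120), (800, 500), (105, 95), (395, 130), (820, 510)]
print(fixation_rqa(scanpath, radius=50))
```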

Data availability

All materials, data, and analysis scripts are made publicly available: https://osf.io/zaht8/?view_only=bea57fcc57c14c05aeb6815e4d080839.

Acknowledgements

We thank our participants and Lorena Di Matteo, Luca Panico and Noam Sedemund for their help during data collection. We also thank Gerda Wyssen for her support and valuable insights.

Funding

This research was funded by the Swiss National Science Foundation (SNSF), grant no. 100014_214940 (PI: FWM).

Author information

Authors and Affiliations

Department of Psychology, University of Bern, 3012 Bern, Switzerland

    Enea J. Weber & Fred W. Mast

Contributions

E.J.W. and F.W.M. conceived the experiments; E.J.W. conducted the experiments and analysed the results; F.W.M. supervised the project and acquired funding. All authors wrote and reviewed the manuscript.

Corresponding author

Correspondence to Enea J. Weber.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Supplementary Information.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Weber, E.J., Mast, F.W. Gaze patterns during visual mental imagery reflect part-based generation. Sci Rep (2026). https://doi.org/10.1038/s41598-026-35447-z

  • Received: 13 June 2025

  • Accepted: 06 January 2026

  • Published: 13 January 2026

  • DOI: https://doi.org/10.1038/s41598-026-35447-z

Keywords

  • Mental imagery
  • Eye-tracking
  • Gaze-contingent window

Associated content

Collection

Mental imagery and consciousness
