Early spatiotemporal dynamics of navigational affordance coding in the dorsal visual cortex
  • Article
  • Open access
  • Published: 08 January 2026

  • Elisa Zamboni (ORCID: orcid.org/0000-0001-9200-8031),
  • Rebecca Lowndes,
  • Richard Aveyard,
  • Catriona L. Scrivener,
  • Jessica A. Teed (ORCID: orcid.org/0009-0001-8821-7356),
  • Yumeng Ma,
  • Antony B. Morland (ORCID: orcid.org/0000-0002-6754-5545) &
  • Edward H. Silson (ORCID: orcid.org/0000-0002-6149-7423)

Nature Communications (2026)

We are providing an unedited version of this manuscript to give early access to its findings. Before final publication, the manuscript will undergo further editing. Please note there may be errors present which affect the content, and all legal disclaimers apply.

Subjects

  • Human behaviour
  • Perception

Abstract

Successful navigation requires extracting navigationally relevant signals from a dynamically changing visual environment. The navigable routes that we identify through the environment are termed its navigational affordances. Here, using a combination of functional magnetic resonance imaging, magnetoencephalography and behavioural testing, we report that the extraction of such navigational affordance information likely takes place rapidly within dorsal early visual cortex, before higher-level scene-selective regions. Whilst we replicate prior work showing the involvement of the occipital place area in navigational affordance coding, whole-brain analyses indicate that the most likely cortical locus is dorsal early visual cortex. Analyses of the spatiotemporal pattern of navigational affordance coding suggest that such information is detectable within ~110 milliseconds of stimulus onset. Finally, by varying the presentation duration of scenes, we demonstrate that navigational affordance representations are emergent, but weak, at stimulus durations as short as 33–66 milliseconds, and become robust at durations greater than 132 milliseconds. Taken together, these data challenge previous views regarding the critical cortical locus of navigational affordance coding and suggest that such affordances can be extracted from very briefly presented stimuli.
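
The ~110 millisecond claim rests on time-resolved multivariate analysis of the MEG signal. As a rough illustration of how such an analysis works, the Python sketch below asks, at every time point, whether a cross-validated classifier can distinguish scenes whose navigational affordances differ. This is a generic, hypothetical example, not the authors' pipeline; the data shapes, condition labels and time grid are invented for illustration.

    # Generic time-resolved MEG decoding sketch (simulated data; NOT the
    # authors' pipeline). At each time point, a cross-validated classifier
    # tests whether sensor patterns distinguish scenes by affordance label.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)

    # Assumed dimensions: 200 trials, 64 sensors, samples spanning -100..500 ms.
    n_trials, n_sensors, n_times = 200, 64, 121
    times_ms = np.linspace(-100, 500, n_times)

    # X: trials x sensors x time; y: one affordance label per trial
    # (e.g. 0 = exit path on the left, 1 = exit path on the right). Simulated.
    X = rng.standard_normal((n_trials, n_sensors, n_times))
    y = rng.integers(0, 2, n_trials)

    # Decode the affordance label independently at every time point.
    accuracy = np.array([
        cross_val_score(LinearDiscriminantAnalysis(), X[:, :, t], y, cv=5).mean()
        for t in range(n_times)
    ])

    # On real data, the first sustained rise of `accuracy` above chance (0.5)
    # estimates the onset latency of affordance information; with this noise
    # input it simply hovers around 0.5.
    peak = accuracy.argmax()
    print(f"peak decoding {accuracy[peak]:.2f} at {times_ms[peak]:.0f} ms")

On real recordings, the latency at which such a decoding curve reliably departs from chance provides the kind of ~110 millisecond onset estimate the abstract reports.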


Data availability

Pre-processed MRI, MEG and behavioural data are available via the Open Science Framework (https://doi.org/10.17605/OSF.IO/PQ2M3). Raw MRI and MEG data will be made available upon request.
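
Readers who prefer to script access can reach the repository programmatically. The sketch below is an illustrative example, assuming the project's default osfstorage provider is publicly readable; the endpoint layout follows the public OSF v2 REST API, and the project identifier pq2m3 is taken from the DOI above.

    # List the files behind https://doi.org/10.17605/OSF.IO/PQ2M3 via the
    # public OSF v2 REST API (assumes the project's storage is public).
    import requests

    url = "https://api.osf.io/v2/nodes/pq2m3/files/osfstorage/"
    while url:
        page = requests.get(url, timeout=30).json()
        for item in page["data"]:
            attrs = item["attributes"]
            print(attrs["kind"], attrs["name"])  # 'file' or 'folder', and its name
        url = page["links"].get("next")  # follow pagination until exhausted

The osfclient package (pip install osfclient) wraps the same API if you would rather not hand-roll the requests.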

Code availability

Analysis code is available via the Open Science Framework (https://doi.org/10.17605/OSF.IO/PQ2M3).

Acknowledgements

Supported by Biotechnology and Biological Sciences Research Council awards BB/V003887/1 to E.H.S. and BB/V003917/1 to A.B.M., and a School of Philosophy, Psychology and Language Sciences Research Support Grant to J.A.T.

Author information

Author notes
  1. Elisa Zamboni

    Present address: School of Psychology, University of Nottingham, University Park, Nottingham, UK

  2. Catriona L. Scrivener

    Present address: School of Psychology and Neuroscience, University of Glasgow, Glasgow, UK

  3. These authors contributed equally: Elisa Zamboni, Rebecca Lowndes.

Authors and Affiliations

  1. Department of Psychology, University of York, York, UK

    Elisa Zamboni, Rebecca Lowndes, Richard Aveyard & Antony B. Morland

  2. Department of Psychology, School of Philosophy, Psychology & Language Sciences, University of Edinburgh, Edinburgh, UK

    Catriona L. Scrivener, Jessica A. Teed, Yumeng Ma & Edward H. Silson

  3. York Biomedical Research Institute, University of York, York, UK

    Antony B. Morland

  4. York Neuroimaging Centre, Department of Psychology, University of York, York, UK

    Antony B. Morland


Contributions

E.Z., R.L. and R.A. collected the fMRI and MEG data. E.Z., R.L., R.A., C.L.S. and E.H.S. analysed the fMRI and MEG data. C.L.S., J.A.T. and E.H.S. designed the behavioural experiment. Y.M. and J.A.T. collected the behavioural data. Y.M., J.A.T. and E.H.S. analysed the behavioural data. A.B.M. and E.H.S. jointly designed the fMRI and MEG experiments. E.H.S. wrote the first draft of the manuscript. All authors contributed to the final version of the manuscript.

Corresponding author

Correspondence to Edward H. Silson.

Ethics declarations

Competing interests

The authors declare no competing interests.

Peer review

Peer review information

Nature Communications thanks Eva Patai and the other, anonymous, reviewer(s) for their contribution to the peer review of this work. A peer review file is available.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Reporting Summary

Transparent Peer Review file

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.


About this article


Cite this article

Zamboni, E., Lowndes, R., Aveyard, R. et al. Early spatiotemporal dynamics of navigational affordance coding in the dorsal visual cortex. Nat Commun (2026). https://doi.org/10.1038/s41467-025-68111-7


  • Received: 15 May 2025

  • Accepted: 17 December 2025

  • Published: 08 January 2026

  • DOI: https://doi.org/10.1038/s41467-025-68111-7

