Abstract
Successful navigation requires extracting navigationally relevant signals from a dynamically changing visual environment. The navigable routes that an environment offers are termed navigational affordances. Here, using a combination of functional magnetic resonance imaging, magnetoencephalography and behavioural testing, we report that the extraction of such navigational affordance information likely takes place rapidly within dorsal early visual cortex, before higher-level scene-selective regions. Whilst we replicate prior work showing the involvement of the occipital place area in navigational affordance coding, whole-brain analyses indicate the most likely cortical locus to be dorsal early visual cortex. Analyses comparing the spatiotemporal pattern of navigational affordances suggest such information is detectable within ~110 milliseconds of stimulus onset. Finally, by varying the presentation durations of scenes, we demonstrate that navigational affordance representations emerge, albeit weakly, at stimulus durations as short as 33-66 milliseconds and become robust at durations above 132 milliseconds. Taken together, these data challenge previous views regarding the critical cortical locus for navigational affordance coding and suggest that such affordances can be extracted from very briefly presented stimuli.
Data availability
Pre-processed MRI, MEG and behavioural data are available via the Open Science Framework (https://doi.org/10.17605/OSF.IO/PQ2M3). Raw MRI and MEG data will be made available upon request.
Code availability
Analysis code is available via the Open Science Framework (https://doi.org/10.17605/OSF.IO/PQ2M3).
Acknowledgements
This work was supported by Biotechnology and Biological Sciences Research Council awards BB/V003887/1 to E.H.S. and BB/V003917/1 to A.B.M., and by a School of Philosophy, Psychology and Language Sciences Research Support Grant to J.A.T.
Author information
Authors and Affiliations
Contributions
E.Z., R.L., R.A. collected the fMRI and MEG data. E.Z., R.L., R.A., C.L.S. and E.H.S. analysed the fMRI and MEG data. C.L.S., J.A.T. and E.H.S. designed the behavioural experiment. Y.M. and J.A.T. collected the behavioural data. Y.M., J.A.T. and E.H.S. analysed the behavioural data. A.B.M. and E.H.S. collectively designed the fMRI and MEG experiments. E.H.S. wrote the first draft of the manuscript. All authors contributed to the final version of the manuscript.
Corresponding author
Ethics declarations
Competing interests
The authors declare no competing interests.
Peer review
Peer review information
Nature Communications thanks Eva Patai and the other, anonymous, reviewer(s) for their contribution to the peer review of this work. A peer review file is available.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary information
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Zamboni, E., Lowndes, R., Aveyard, R. et al. Early spatiotemporal dynamics of navigational affordance coding in the dorsal visual cortex. Nat Commun (2026). https://doi.org/10.1038/s41467-025-68111-7
Received:
Accepted:
Published:
DOI: https://doi.org/10.1038/s41467-025-68111-7