

Distinct audio and visual accumulators co-activate motor preparation for multisensory detection

Abstract

Detecting targets in multisensory environments is an elemental brain function, but it is not yet known whether information from different sensory modalities is accumulated by distinct processes, and, if so, whether the processes are subject to separate decision criteria. Here we address this in two experiments (n = 22, n = 21) using a paradigm design that enables neural evidence accumulation to be traced through a centro-parietal positivity and modelled alongside response time distributions. Through analysis of both redundant (respond-to-either-modality) and conjunctive (respond-only-to-both) audio-visual detection data, joint neural–behavioural modelling, and a follow-up onset-asynchrony experiment, we found that auditory and visual evidence is accumulated in distinct processes during multisensory detection, and cumulative evidence in the two modalities sub-additively co-activates a single, thresholded motor process during redundant detection. These findings answer long-standing questions about information integration and accumulation in multisensory conditions.
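To make the contrast between these architectures concrete, the following minimal MATLAB sketch (not the authors' fitted model; drift rates, noise level and bound are illustrative assumptions) simulates two independent modality-specific accumulators on redundant-target trials and compares a race rule, in which either accumulator alone can trigger the response, with a co-activation rule, in which the combined evidence drives a single motor threshold. The additive combination used here is a simplification of the sub-additive co-activation reported in the paper.

% Minimal simulation sketch: two modality-specific accumulators compared
% under a race rule and a co-activation rule. All parameter values are
% illustrative assumptions, not fitted estimates.
rng(1);
nTrials = 5000;                 % simulated redundant-target trials
dt      = 0.001;                % time step (s)
maxT    = 2.0;                  % maximum simulated decision time (s)
nSteps  = round(maxT/dt);
driftA  = 1.2;                  % auditory drift rate (arbitrary units)
driftV  = 1.0;                  % visual drift rate (arbitrary units)
sigma   = 1.0;                  % within-trial noise SD
bound   = 1.0;                  % motor threshold

rtRace  = nan(nTrials,1);
rtCoact = nan(nTrials,1);
for t = 1:nTrials
    xA = cumsum(driftA*dt + sigma*sqrt(dt)*randn(nSteps,1));  % auditory accumulator
    xV = cumsum(driftV*dt + sigma*sqrt(dt)*randn(nSteps,1));  % visual accumulator

    % Race: respond when either accumulator alone reaches the bound
    hitRace = find(xA >= bound | xV >= bound, 1);
    if ~isempty(hitRace), rtRace(t) = hitRace*dt; end

    % Co-activation: a single motor process integrates the combined evidence
    hitCo = find(xA + xV >= bound, 1);
    if ~isempty(hitCo), rtCoact(t) = hitCo*dt; end
end

fprintf('Median RT  race: %.3f s   co-activation: %.3f s\n', ...
    median(rtRace,'omitnan'), median(rtCoact,'omitnan'));

On this simulation the co-activation rule yields faster redundant-target responses than the race rule, illustrating the kind of behavioural signature that the joint neural–behavioural modelling was designed to disentangle from the neural evidence.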


Fig. 1: The main alternative processing architectures for multisensory signal detection, assuming auditory and visual sensory inputs.
Fig. 2: Audio-visual detection task and the resulting behaviour for Experiment 1.
Fig. 3: Predicted dynamics for a single accumulator of evidence for target presence (top), compared to the observed evidence accumulation dynamics indexed by the CPP (middle) and motor preparation dynamics indexed by left-hemisphere beta activity (~20 Hz; bottom).
Fig. 4: Model comparison.
Fig. 5: Results of Experiment 2.


Data availability

The data reported in this manuscript are available via OSF at https://osf.io/64d8e/.

Code availability

The MATLAB analysis code is available via OSF at https://osf.io/64d8e/.


Acknowledgements

This work was supported by a Career Development Award from Science Foundation Ireland (15/CDA/3591) and a Wellcome Trust Investigator Award in Science (219572/Z/19/Z), both to S.P.K. R.G.O. was supported by a European Research Council Consolidator Grant (IndDecision 865474). J.J.F. is supported by a centre grant from the Eunice Kennedy Shriver National Institute of Child Health and Human Development (P50 HD103536). Data collection (M.G.-R.) for the supplemental experiment reported in Supplementary Fig. 2 was supported by a grant from the National Institute of Mental Health to J.J.F. (R01 MH85322). The funders had no role in study design, data collection and analysis, decision to publish or preparation of the manuscript.

Author information


Contributions

J.M.E. originally conceived the study, and developed it and designed the experiments with S.P.K. J.M.E. collected and analysed the data with guidance from S.P.K. and R.G.O. J.M.E. designed the models with guidance from S.P.K. M.G.-R. and J.J.F. designed the pulsed-stimulus multisensory detection experiment reported in Supplementary Fig. 2, and M.G.-R. collected those data. J.M.E., R.G.O. and S.P.K. interpreted the results and wrote the paper, with input from M.G.-R. and J.J.F.

Corresponding authors

Correspondence to John M. Egan or Simon P. Kelly.

Ethics declarations

Competing interests

The authors declare no competing interests.

Peer review

Peer review information

Nature Human Behaviour thanks Hans Colonius, Daniel Senkowski and the other, anonymous, reviewer(s) for their contribution to the peer review of this work. Peer reviewer reports are available.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Supplementary Information

Supplementary Figs. 1–4 and Table 1.

Reporting Summary

Peer Review File

Supplementary Video 1

Demo video of stimuli for Experiment 1.

Supplementary Video 2

Demo video of stimuli for Experiment 2.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Egan, J.M., Gomez-Ramirez, M., Foxe, J.J. et al. Distinct audio and visual accumulators co-activate motor preparation for multisensory detection. Nat Hum Behav (2025). https://doi.org/10.1038/s41562-025-02280-9

