2018
DOI: 10.1101/504845
Preprint

The neural dynamics of hierarchical Bayesian inference in multisensory perception

Abstract: Transforming the barrage of sensory signals into a coherent multisensory percept relies on solving the binding problem: deciding whether signals come from a common cause and should be integrated, or instead be segregated. Human observers typically arbitrate between integration and segregation consistent with Bayesian Causal Inference, but the neural mechanisms remain poorly understood. We presented observers with audiovisual sequences that varied in the number of flashes and beeps. Combining Bayesian modelling a…
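For readers unfamiliar with the model named in the abstract, the sketch below implements the standard Gaussian, model-averaging form of Bayesian Causal Inference (Körding et al., 2007) for estimating one modality (here, auditory numerosity) from noisy audiovisual measurements. It is a minimal illustration under assumed settings, not the preprint's actual model: the study may use discrete numerosities or other decision strategies (model selection, probability matching), and every variable name and parameter here is an assumption.

```python
# Minimal sketch of Bayesian Causal Inference with Gaussian likelihoods.
# All names and values are illustrative; they are not taken from the preprint.
import numpy as np

def bci_estimate(x_a, x_v, sigma_a, sigma_v, sigma_p, mu_p, p_common):
    """Model-averaged auditory estimate from noisy auditory (x_a) and
    visual (x_v) internal measurements, plus posterior P(common cause)."""
    # Fusion estimate (common cause, C=1): reliability-weighted average
    w = np.array([1 / sigma_a**2, 1 / sigma_v**2, 1 / sigma_p**2])
    s_fused = (w[0] * x_a + w[1] * x_v + w[2] * mu_p) / w.sum()

    # Segregation estimate (independent causes, C=2): audition + prior only
    s_seg = (x_a / sigma_a**2 + mu_p / sigma_p**2) / (1 / sigma_a**2 + 1 / sigma_p**2)

    # Likelihood of the measurements under each causal structure
    var_c1 = (sigma_a**2 * sigma_v**2 + sigma_a**2 * sigma_p**2
              + sigma_v**2 * sigma_p**2)
    like_c1 = np.exp(-0.5 * ((x_a - x_v)**2 * sigma_p**2
                             + (x_a - mu_p)**2 * sigma_v**2
                             + (x_v - mu_p)**2 * sigma_a**2) / var_c1) \
              / (2 * np.pi * np.sqrt(var_c1))
    like_c2 = (np.exp(-0.5 * (x_a - mu_p)**2 / (sigma_a**2 + sigma_p**2))
               / np.sqrt(2 * np.pi * (sigma_a**2 + sigma_p**2))
               * np.exp(-0.5 * (x_v - mu_p)**2 / (sigma_v**2 + sigma_p**2))
               / np.sqrt(2 * np.pi * (sigma_v**2 + sigma_p**2)))

    # Posterior probability of a common cause, then model averaging
    post_c1 = like_c1 * p_common / (like_c1 * p_common + like_c2 * (1 - p_common))
    return post_c1 * s_fused + (1 - post_c1) * s_seg, post_c1
```

For example, `bci_estimate(x_a=3.0, x_v=4.0, sigma_a=0.6, sigma_v=0.4, sigma_p=2.0, mu_p=2.5, p_common=0.5)` returns the model-averaged auditory estimate together with the posterior probability that the flashes and beeps share a common cause (illustrative values only).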

Cited by 6 publications (8 citation statements)
References 82 publications (123 reference statements)
“…Critically, the neural data challenge the dichotomy between fusion and CI as separate accounts of multisensory perception. Rather, they support the notion that perception is a hierarchical process relying on the explicit representations of distinct multisensory computations orchestrated over several brain regions (Rohe and Noppeney, 2015b; Rohe et al., 2019; Kayser and Shams, 2015). Our results suggest that representations as predicted by each model coexist and unveil the functional hierarchy of the underlying computations in distinct regions and over different timescales.…”
Section: Temporal Hierarchy of Multisensory Computations (supporting)
confidence: 78%
“…Instead, our Bayesian modelling analysis confirmed that from 350 ms onwards, the brain integrates audiovisual signals weighted by their bottom-up reliability and top-down task relevance into spatial priority maps [36,37] that take into account the probabilities of the different causal structures consistent with Bayesian causal inference. The spatial priority maps were behaviourally relevant for guiding spatial orienting and actions, as indicated by the correlation between the neural and behavioural audiovisual weight indices, which progressively increased from 100 ms and culminated at about 300–400 ms. Two recent studies have also demonstrated such a temporal evolution of Bayesian causal inference in an audiovisual temporal numerosity judgement task [38] and an audiovisual rate categorisation task [39].…”
Section: Discussion (mentioning)
confidence: 98%
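As a reference point for the "reliability-weighted" integration described in the statement above, the forced-fusion rule under Gaussian noise is the textbook inverse-variance weighting (a standard identity, not a formula quoted from the cited study):

```latex
\hat{s}_{AV} = w_A\,\hat{s}_A + w_V\,\hat{s}_V,
\qquad
w_A = \frac{1/\sigma_A^{2}}{1/\sigma_A^{2} + 1/\sigma_V^{2}},
\quad
w_V = 1 - w_A .
```

Bayesian causal inference applies this weighting only to the extent that the signals are attributed to a common cause, which is why behavioural and neural audiovisual weight indices can shift between integration and segregation rather than remaining fixed at the fusion prediction.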
“…How does the brain determine whether signals arise from common or independent causes based on spatiotemporal correspondence cues? Previous research (Rohe and Noppeney, 2015b, 2016; Aller and Noppeney, 2019; Cao et al., 2019; Rohe et al., 2019) could not address this critical question because observers' implicit causal inference was inherently correlated with the physical correspondence cues (e.g., spatial, temporal, or rate). To define the neural systems underlying causal inference, we need to dissociate the decisional outcome of observers' causal inference from the underlying physical correspondence cues such as, for example, the spatial congruency of audiovisual signals.…”
Section: Introduction (mentioning)
confidence: 99%
“…At the neural level, functional magnetic resonance imaging (fMRI), magnetoencephalography, and electroencephalography research (Rohe and Noppeney, 2015b, 2016; Aller and Noppeney, 2019; Cao et al., 2019; Rohe et al., 2019) has recently suggested that the brain flexibly combines sensory signals by dynamically encoding multiple perceptual estimates at distinct cortical levels along the visual and auditory processing hierarchies. For instance, early (50-100 ms) neural processes in primary sensory areas encoded predominantly the spatial locations independently for auditory and visual signals, while later processes (100-200 ms) in posterior intraparietal sulcus (IPS) (IPS1-2) formed spatial representations by combining audiovisual signals.…”
Section: Introduction (mentioning)
confidence: 99%