2017
DOI: 10.1523/jneurosci.2926-16.2017

Being First Matters: Topographical Representational Similarity Analysis of ERP Signals Reveals Separate Networks for Audiovisual Temporal Binding Depending on the Leading Sense

Abstract: In multisensory integration, processing in one sensory modality is enhanced by complementary information from other modalities. Intersensory timing is crucial in this process because only inputs reaching the brain within a restricted temporal window are perceptually bound. Previous research in the audiovisual field has investigated various features of the temporal binding window, revealing asymmetries in its size and plasticity depending on the leading input: auditory–visual (AV) or visual–auditory (VA). Here,…
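
The excerpt names the method, topographical representational similarity analysis (RSA) of ERP signals, without describing it. As a generic illustration only, and not the authors' actual pipeline, the sketch below shows one common way a representational dissimilarity matrix (RDM) can be built from scalp topographies and compared across time points. The array shapes, variable names, correlation distance, and Spearman comparison are all assumptions made for this example.

import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import spearmanr

# Hypothetical ERP data: conditions x channels x time points
# (e.g., AV vs. VA trials at several stimulus-onset asynchronies).
n_conditions, n_channels, n_times = 8, 64, 300
rng = np.random.default_rng(0)
erp = rng.standard_normal((n_conditions, n_channels, n_times))

def topographic_rdm(data, t):
    """RDM across conditions: correlation distance between scalp
    topographies (channel patterns) at time point t."""
    topographies = data[:, :, t]              # conditions x channels
    return squareform(pdist(topographies, metric="correlation"))

def compare_rdms(rdm_a, rdm_b):
    """Spearman correlation between the upper triangles of two RDMs."""
    iu = np.triu_indices_from(rdm_a, k=1)
    rho, _ = spearmanr(rdm_a[iu], rdm_b[iu])
    return rho

# Example: how similar is the representational geometry at an early
# versus a late time point?
early, late = topographic_rdm(erp, 50), topographic_rdm(erp, 250)
print(f"RDM similarity (early vs. late): {compare_rdms(early, late):.3f}")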

Cited by 32 publications (26 citation statements) | References 82 publications
“…First, the lower reliability of behavioural measures for vision‐leading pairs (in our case, seen as a tendency for wider confidence intervals for perceived synchrony boundaries) might have blurred the detection of phase alignment in pre‐stimulus EEG. This larger perceptual variability for vision‐first events has been noted in many previous psychophysical studies, when asymmetries have been sought for (Yarrow et al., ), and is probably related to the well known lower sensitivity in asynchrony detection when vision leads audition (Cecere, Gross, Willis, & Thut, ; Conrey & Pisoni, ; Stevenson & Wallace, ; Van Wassenhove et al., ). A second, perhaps complementary, explanation for the asymmetry in our results, is that synchrony perception of different modality orders might rely on distinct, partially non‐overlapping (modality specific) neural processes (Cecere et al., , ; Thorne & Debener, ).…”
Section: Discussion (supporting)
confidence: 52%
“…This larger perceptual variability for vision‐first events has been noted in many previous psychophysical studies, when asymmetries have been sought for (Yarrow et al., ), and is probably related to the well known lower sensitivity in asynchrony detection when vision leads audition (Cecere, Gross, Willis, & Thut, ; Conrey & Pisoni, ; Stevenson & Wallace, ; Van Wassenhove et al., ). A second, perhaps complementary, explanation for the asymmetry in our results, is that synchrony perception of different modality orders might rely on distinct, partially non‐overlapping (modality specific) neural processes (Cecere et al., , ; Thorne & Debener, ). This explanation might not seem parsimonious at first, but its plausibility is supported after recent findings of analogous asymmetries in synchrony processing across different modality orders in other electrophysiological studies (Kaganovich & Schumaker, ; Kösem et al., ).…”
Section: Discussion (supporting)
confidence: 52%
“…It has been suggested that the A-leading cues served as an alerting mechanism to boost visual processing immediately in crossmodal integration; in contrast, V-leading cues promoted the auditory system to make predictions about forthcoming auditory events (Thorne and Debener, 2014). Thus, our results suggested that auditory signs could activate the visual cortex faster to prepare for the incoming visual processing (Cecere et al, 2017) and improve the detection of facial emotions for MADs.…”
Section: Discussion (mentioning)
confidence: 62%
“…Under crossmodal integration, the anterior stimuli played the priming role (Stekelenburg and Vroomen, 2007), which prompted sensory systems to enter the readiness state and to speed up the response to later targets. Such prediction was likely related to the sensory modality of the cues (Cecere et al, 2017). Based on the prediction and promotion effect of the emotional cues during the crossmodal mode, Experiment 2 discussed whether the emotional recognition disorder of MADs could be improved through the designs of sequential presentation of crossmodal cues.…”
Section: Discussion (mentioning)
confidence: 99%