2014
DOI: 10.3389/fpsyg.2014.00727
Effect of attentional load on audiovisual speech perception: evidence from ERPs

Abstract: Seeing articulatory movements influences perception of auditory speech. This is often reflected in a shortened latency of auditory event-related potentials (ERPs) generated in the auditory cortex. The present study addressed whether this early neural correlate of audiovisual interaction is modulated by attention. We recorded ERPs in 15 subjects while they were presented with auditory, visual, and audiovisual spoken syllables. Audiovisual stimuli consisted of incongruent auditory and visual components known to …

Cited by 79 publications (91 citation statements)
References 64 publications
“…The results are consistent with the previous findings (Nahorna et al., 2012; Navarra et al., 2010; Alsius et al., 2014). Experiment B shows that attentional mechanisms may intervene at the level of single audiovisual sources in an audiovisual speech scene, selectively increasing or decreasing the amount of fusion depending on the coherence of the attended source.…”
Section: Response Time | supporting
confidence: 82%
“…As demonstrated by McGurk and MacDonald (1976), lip-read context can change perceived sound identity, and when it does, it triggers an auditory MMN response when the illusory AV stimulus is embedded in a string of congruent AV stimuli (e.g., Colin, Radeau, Soquet, & Deltenre, 2004; Colin et al., 2002; Saint-Amour, De Sanctis, Molholm, Ritter, & Foxe, 2007). When sound onset is sudden and does not follow repeated presentations of standard sounds, it triggers an N1/P2 complex (a negative peak at 100 ms followed by a positive peak at ~200 ms), and it is well-documented that the amplitude and latency of both peaks are modulated by lip-read speech (e.g., Alsius, Möttönen, Sams, Soto-Faraco, & Tiippana, 2014; Baart, Stekelenburg, & Vroomen, 2014; Besle, Fort, Delpuech, & Giard, 2004; Frtusova, Winneke, & Phillips, 2013; Klucharev, Möttönen, & Sams, 2003; Stekelenburg, Maes, van Gool, Sitskoorn, & Vroomen, 2013; van Wassenhove, Grant, & Poeppel, 2005; Winneke & Phillips, 2011). Thus, studies measuring both the MMN and the N1/P2 peaks indicate that lip-reading affects sound processing within 200 to 250 ms after sound onset.…”
Section: Introduction | mentioning
confidence: 99%
“…van Wassenhove et al., 2005) were found to be substantially reduced (or even eliminated; Alsius et al., 2014). …”
Section: Stimulus-based Effects | mentioning
confidence: 99%