2002
DOI: 10.1016/s0926-6410(02)00053-8

Processing of changes in visual speech in the human auditory cortex

Cited by 125 publications (141 citation statements)
References 39 publications
“…This suggests that phonetic recalibration is, like selective speech adaptation (Samuel & Kat, 1998), a low-level process that occurs in an automatic fashion. This finding is in line with other research demonstrating that the on-line integration of auditory and visual speech is automatic (Besle et al., 2004; Calvert & Campbell, 2003; Campbell et al., 2001; Colin et al., 2002; Massaro, 1987; McGurk & MacDonald, 1976; Möttönen et al., 2002; Näätänen, 2001; Soto-Faraco et al., 2004).…”

Section: Discussion (supporting)

confidence: 93%
“…Some have reported that visual speech may affect auditory processing as early as the auditory cortex (Calvert et al., 1997; Colin et al., 2002; Möttönen, Krause, Tiippana, & Sams, 2002; Pekkola et al., 2005; Sams et al., 1991). The interaction has been found to occur between 150 and 250 ms using the mismatch negativity paradigm (Colin et al., 2002; Möttönen et al., 2002; Sams et al., 1991), while others have reported that as early as 100 ms the auditory N1 component is attenuated and speeded up when auditory speech is accompanied by lipread information (Besle et al., 2004; van Wassenhove et al., 2005), possibly because visual speech predicts when a sound is going to occur (Vroomen & Stekelenburg, 2010). Notably, though, to date it is not known whether auditory speech also affects visual processing of lipread speech.…”

Section: Experiment 1, Introduction (mentioning)

confidence: 99%
“…Evoked-potential data resolve the temporal scale of the neural response with excellent fidelity and enable us to see when visual stimuli impact auditory processing. From this literature, we know that seeing lip movements while listening to speech speeds the latency of peaks as early as 10 msec poststimulation in human auditory brainstem responses (Musacchia et al., 2006) and from 40 to 200 msec poststimulation in cortical EPs (Möttönen et al., 2002; van Wassenhove et al., 2005). Recent cortical data also show smaller N1 peak amplitudes at approximately 120 to 140 msec to AV speech (Besle et al., 2004) and nonspeech stimuli (Stekelenburg & Vroomen, 2007).…”

Section: Introduction (mentioning)

confidence: 99%
“…The detection of mismatch between auditory speech and visual articulation has been documented in several electrophysiological studies, with a mismatch-related ERP response found over frontal-lateral electrodes (Bristow et al., 2009; Kushnerenko et al., 2008; see also Möttönen, Krause, Tiippana, & Sams, 2002; Ross, Saint-Amour, Leavitt, Javitt, & Foxe, 2007). Crucially, Kushnerenko and colleagues (2008) reported that five-month-olds already show the audiovisual event-related mismatch response (AVMMR) to a conflicting combination of cues (VbaAga) and that this response is absent in trials where both cues can be fused into a single percept (VgaAba).…”

Section: Attention to AV Mismatch Is Associated With the Neural Mismatch (mentioning)

confidence: 94%