2022
DOI: 10.1523/jneurosci.2476-21.2022

Differential Auditory and Visual Phase-Locking Are Observed during Audio-Visual Benefit and Silent Lip-Reading for Speech Perception

Abstract: Speech perception in noisy environments is enhanced by seeing facial movements of communication partners. However, the neural mechanisms by which audio and visual speech are combined are not fully understood. We explore MEG phase-locking to auditory and visual signals in MEG recordings from 14 human participants (6 females, 8 males) who reported words from single spoken sentences. We manipulated the acoustic clarity and visual speech signals such that critical speech information is present in auditory, visual…
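The abstract refers to phase-locking between MEG responses and auditory/visual speech signals. One common way to quantify such stimulus-brain phase-locking is magnitude-squared coherence; the minimal Python sketch below illustrates the idea on simulated data. The sampling rate, the 1-8 Hz band, and all signal names are illustrative assumptions, not the authors' pipeline.

# Illustrative only: coherence between a simulated "MEG channel" and a
# simulated acoustic envelope that share a ~4 Hz (syllable-rate) component.
import numpy as np
from scipy.signal import coherence

fs = 250.0                      # assumed sampling rate after resampling (Hz)
t = np.arange(0, 60, 1 / fs)    # 60 s of simulated data
rng = np.random.default_rng(0)

acoustic_env = 1 + np.sin(2 * np.pi * 4 * t)   # toy speech envelope
meg_channel = 0.3 * np.sin(2 * np.pi * 4 * t + 0.5) + rng.standard_normal(t.size)

# Magnitude-squared coherence via Welch's method (4 s windows).
f, coh = coherence(meg_channel, acoustic_env, fs=fs, nperseg=int(4 * fs))

band = (f >= 1) & (f <= 8)      # delta/theta range often examined for speech
print(f"Mean 1-8 Hz coherence: {coh[band].mean():.3f}")

Coherence is only one of several phase-locking measures; the same simulated data could equally be analyzed with a phase-locking value or mutual information.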

Cited by 19 publications (17 citation statements)
References 59 publications
“…Brodbeck, Hong, et al., 2018; Kulasingham et al., 2020). The fROIs for the lip movements involved parietal and occipital regions, in line with previous studies that source-localized the neural tracking of lip movements (Aller et al., 2022; Bourguignon et al., 2020; Hauswald et al., 2018). Similar to Park et al. (2016), we also observed neural tracking of lip movements in temporal regions (see Figure S1), but with less involvement of the primary visual cortex and prominent only in the single-speaker condition.…”
Section: Neural Tracking of Audiovisual Speech (supporting)
confidence: 90%
“…Analogous to behavioral findings in Aller et al. (2022), the benefit of lip movements showed high interindividual variability (see Figure 2C) and followed a bimodal distribution. Some individuals benefited massively from lip movements, while others showed only a small benefit or none at all.…”
Section: Benefit of Lip Movements (supporting)
confidence: 73%
“…The number of channels also determines whether noise-vocoded stimuli are intelligible: 16-channel noise-vocoded speech is clearly intelligible, even for naïve listeners, whereas 1-channel noise-vocoded speech sounds like noise to naïve listeners and is completely unintelligible when presented in isolation. 1-channel vocoded speech yields <5% word report accuracy when presented alone, but ~30% when combined with visual cues like lip movements [22], suggesting that noise-vocoded speech can be perceived as speech-like, even when unintelligible. Importantly, however, these manipulations reveal changes in speech intelligibility that are independent of amplitude modulations, since these are very similar for 1-channel and 16-channel speech [23].…”
Section: Introduction (mentioning)
confidence: 99%
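For readers unfamiliar with the manipulation described in the excerpt above, the sketch below implements a generic noise vocoder: speech is split into logarithmically spaced frequency bands, each band's amplitude envelope is extracted and used to modulate band-limited noise, and the bands are summed. Band edges, filter order, and normalization are illustrative choices, not the stimuli of the cited studies.

# Generic noise-vocoder sketch. With n_channels=16 the output is typically
# intelligible; with n_channels=1 only the broadband envelope survives.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def noise_vocode(speech, fs, n_channels=16, f_lo=100.0, f_hi=8000.0):
    # speech: float waveform; band edges spaced logarithmically across range.
    edges = np.geomspace(f_lo, f_hi, n_channels + 1)
    rng = np.random.default_rng(0)
    out = np.zeros_like(speech)
    for lo, hi in zip(edges[:-1], edges[1:]):
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = sosfiltfilt(sos, speech)
        env = np.abs(hilbert(band))              # amplitude envelope of the band
        carrier = sosfiltfilt(sos, rng.standard_normal(speech.size))
        out += env * carrier                     # envelope-modulated noise band
    return out / np.max(np.abs(out))             # normalize to avoid clipping

# Usage: noise_vocode(waveform, fs=22050, n_channels=1) approximates the
# unintelligible 1-channel condition; n_channels=16 is near-ceiling for speech.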
“…Several studies suggest that older adults with high-frequency hearing loss in particular benefit from visual speech cues and show improved SiN performance (Altieri & Hudock, 2014; Hallam & Corney, 2014; Lidestam et al., 2014; Winneke & Phillips, 2011), while other studies clearly find a benefit to speech perception from audio-visual speech presentation, but independent of age and the degree of hearing loss and with considerable individual variability in the extent of this benefit (Başkent & Bazo, 2011; Rosemann & Thiel, 2018; Sommers et al., 2005; Tye-Murray et al., 2007). At the neural level, visual speech cues appear to enhance neural tracking of the speech envelope (Aller et al., 2022; Crosse et al., 2016; Micheli et al., 2018; Park et al., 2016) and restore early cortical tracking of speech presented in noise, complementing impaired auditory input (Atilgan et al., 2018; Crosse et al., 2015; Zion Golumbic et al., 2013). This audio-visual enhancement of neural speech tracking has also been observed in older individuals (Puschmann et al., 2019), although the relationship to SiN perception and comprehension has not yet been sufficiently established.…”
Section: Introduction (mentioning)
confidence: 99%
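The "neural tracking of the speech envelope" mentioned in this excerpt is often quantified with a linear temporal response function (TRF), as in the encoding-model literature it cites (e.g., Crosse et al.). The toy ridge-regression sketch below recovers a known 100 ms lag from simulated data; it is a simplified stand-in, not the mTRF toolbox or the analyses of the cited studies, and all data and parameters are assumptions.

# Toy ridge-regression TRF: regress lagged copies of a (simulated) speech
# envelope onto a (simulated) neural channel that tracks it at a 100 ms lag.
import numpy as np

def fit_trf(envelope, neural, fs, tmin=-0.1, tmax=0.4, lam=1e2):
    lags = np.arange(int(tmin * fs), int(tmax * fs) + 1)
    # Design matrix: one column per time-lagged copy of the envelope.
    X = np.column_stack([np.roll(envelope, lag) for lag in lags])
    XtX = X.T @ X + lam * np.eye(X.shape[1])   # ridge-regularized covariance
    w = np.linalg.solve(XtX, X.T @ neural)     # one TRF weight per lag
    return lags / fs, w

fs = 100.0
rng = np.random.default_rng(1)
env = rng.standard_normal(6000)                            # 60 s toy envelope
neural = np.roll(env, int(0.1 * fs)) + rng.standard_normal(6000)
times, trf = fit_trf(env, neural, fs)
print(f"Peak TRF lag: {times[np.argmax(trf)] * 1000:.0f} ms")  # ~100 ms expected

In practice the regularization strength would be chosen by cross-validation and tracking strength assessed by predicting held-out neural data, but the lag structure of the fitted weights is the quantity of interest here.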