2021
DOI: 10.1101/2021.02.09.430299
Preprint

A linguistic representation in the visual system underlies successful lipreading

Abstract: There is considerable debate over how visual speech is processed in the absence of sound and whether neural activity supporting lipreading occurs in visual brain areas. Surprisingly, much of this ambiguity stems from a lack of behaviorally grounded neurophysiological findings. To address this, we conducted an experiment in which human observers rehearsed audiovisual speech for the purpose of lipreading silent versions during testing. Using a combination of computational modeling, electroencephalography, and si…

Cited by 14 publications (25 citation statements)
References 79 publications
“…This suggests a direct and comprehension-relevant link between the dynamics of the lip contour and spectral speech features (Campbell, 2008). Hence, a representation of acoustic features during silent lip reading may underlie the mapping of lip movements onto phonological units such as visemes, a form of language-specific representation emerging along visual pathways (Nidiffer et al, 2021; O’Sullivan et al, 2017).…”
Section: Discussion
confidence: 99%
“…A recent study using electrocorticography similarly demonstrated that medial occipital cortex exhibits reliable auditory envelope tracking in the absence of visual speech (Micheli et al, 2020). Other studies have suggested that visual cortex represents the unheard auditory speech during silent lipreading in the form of its amplitude envelope (Hauswald et al, 2018) and higher-level linguistic feature representations (Nidiffer et al, 2021; Suess et al, 2021). Correspondingly, we also found evidence of visual cortex tracking the unheard auditory speech envelope in silent lipreading (Fig.…”
Section: Discussion
confidence: 99%
“…Auditory prediction errors (and hence auditory phase-locking) arise in visual cortical areas when degraded speech sounds cannot accurately predict visual speech cues during audio-visual speech perception. The resulting prediction errors signal viseme information (Nidiffer et al, 2021) that can be used to update higher-level interpretations and support optimal speech perception when visual and auditory stimuli must be combined (Olasagasti et al, 2015).…”
Section: Cross-modal Prediction of Audio and Visual Speech Signals
confidence: 99%
“…They provided evidence for a linguistic representation in the visual cortex stemming from visemic information of speech, a process independent of auditory processing associated with lip reading (Nidiffer et al, 2021).…”
Section: Occlusion of Lip Movements Impairs Tracking of Higher-level Segmentational Features Especially in Challenging Listening Situations
confidence: 99%