2018
DOI: 10.1111/ejn.13992

Electrocorticography reveals continuous auditory and visual speech tracking in temporal and occipital cortex

Abstract: During natural speech perception, humans must parse temporally continuous auditory and visual speech signals into sequences of words. However, most studies of speech perception present only single words or syllables. We used electrocorticography (subdural electrodes implanted on the brains of epileptic patients) to investigate the neural mechanisms for processing continuous audiovisual speech signals consisting of individual sentences. Using partial correlation analysis, we found that posterior superior tempor…
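The abstract describes a partial-correlation analysis of continuous audiovisual speech tracking. Below is a minimal sketch of that general idea, assuming a generic residualization-based partial correlation; the variable names and toy signals are illustrative and are not taken from the paper.

```python
# Minimal sketch: correlate a neural time series with the auditory speech
# envelope while controlling for a visual feature (e.g., mouth movement).
# All names and toy data here are hypothetical, not the authors' data.
import numpy as np
from scipy import stats

def partial_corr(x, y, z):
    """Pearson correlation of x and y after regressing out covariate z from both."""
    design = np.column_stack([np.ones_like(z), z])                 # covariate plus intercept
    rx = x - design @ np.linalg.lstsq(design, x, rcond=None)[0]    # residual of x given z
    ry = y - design @ np.linalg.lstsq(design, y, rcond=None)[0]    # residual of y given z
    return stats.pearsonr(rx, ry)

# Toy signals: 10 s at 100 Hz
rng = np.random.default_rng(0)
t = np.arange(0, 10, 0.01)
audio_env = np.abs(np.sin(2 * np.pi * 4 * t)) + 0.1 * rng.standard_normal(t.size)
visual_feat = np.roll(audio_env, 15) + 0.1 * rng.standard_normal(t.size)   # correlated visual covariate
neural = 0.6 * audio_env + 0.3 * visual_feat + rng.standard_normal(t.size)  # hypothetical electrode signal

r, p = partial_corr(neural, audio_env, visual_feat)
print(f"partial correlation with auditory envelope (visual controlled): r = {r:.2f}, p = {p:.3g}")
```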

Cited by 27 publications (39 citation statements). References 71 publications. Citing publications: 2018–2024.

“…Our data extend those findings to audiovisual speech. Of note, although our results highlight right hemispheric regions, it has been amply demonstrated that the left hemisphere, and especially the left superior temporal cortex, also participates in audiovisual speech integration (10, 41–43).…”
Section: Discussion (contrasting)
confidence: 54%
“…This phase alignment in turn determines systematic, stimulus-locked variations in neuronal activity, as indexed by fluctuations in broadband high-frequency activity. It was further shown that visual speech gestures enhance intelligibility by facilitating auditory cortical entrainment to the speech stream (Crosse, Butler, & Lalor, 2015; Perrodin et al., 2015; Park et al., 2016, 2018; Di Liberto et al., 2018; Micheli et al., 2018). Here, we used iEEG recordings for a more direct examination of the neurophysiological mechanisms underlying visual enhancement of auditory cortical speech processing.…”
Section: Discussion (mentioning)
confidence: 99%
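The entrainment effect described in the statement above is commonly quantified as coherence (or lagged cross-correlation) between the acoustic speech envelope and low-frequency fluctuations of broadband high-gamma amplitude. The sketch below assumes generic signal names, a 500 Hz sampling rate, and a 70–150 Hz band; it is an illustration of the approach, not the authors' pipeline.

```python
# Illustrative sketch: coherence between a speech envelope and the
# high-gamma amplitude of one iEEG channel. Toy data only.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert, coherence

fs = 500.0                                   # sampling rate (Hz), assumed
rng = np.random.default_rng(1)
speech_env = rng.standard_normal(int(10 * fs))                    # placeholder envelope
ieeg = 0.5 * speech_env + rng.standard_normal(speech_env.size)    # placeholder channel

# High-gamma amplitude: band-pass 70-150 Hz, then Hilbert envelope
b, a = butter(4, [70, 150], btype="bandpass", fs=fs)
high_gamma = np.abs(hilbert(filtfilt(b, a, ieeg)))

# Coherence in the range where speech envelopes fluctuate (~1-8 Hz)
f, cxy = coherence(speech_env, high_gamma, fs=fs, nperseg=int(2 * fs))
band = (f >= 1) & (f <= 8)
print(f"mean 1-8 Hz envelope/high-gamma coherence: {cxy[band].mean():.2f}")
```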
“…For example, electrophysiological research in nonhuman primates shows that neural activity in primary auditory cortex is modulated by multisensory cues, including from motor/haptic (Lakatos, Chen, O'Connell, Mills, & Schroeder, 2007) and visual (Lakatos, Karmos, Mehta, Ulbert, & Schroeder, 2008) inputs. In humans, regions of posterior STS also represent visual amplitude envelope information (Micheli et al., 2018). Non-auditory information may arrive at primary auditory cortex through nonspecific thalamic pathways (Schroeder, Lakatos, Kajikawa, Partan, & Puce, 2008) and/or cortico-cortical connections from visual to auditory cortex (Arnal & Giraud, 2012).…”
Section: Early Integration Models of Audiovisual Integration (mentioning)
confidence: 99%
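The "visual amplitude envelope" mentioned in the statement above is generally a frame-rate time series of visible speech movement, analogous to the acoustic amplitude envelope. The sketch below shows one common way such features are computed; the specific choices (Hilbert envelope, frame-difference motion energy, toy array sizes) are assumptions and not the features used in the cited work.

```python
# Generic sketch of the two stimulus time series such studies typically use:
# the acoustic amplitude envelope and a visual motion-energy "envelope".
import numpy as np
from scipy.signal import hilbert, resample

def audio_envelope(audio, n_out):
    """Broadband amplitude envelope via the Hilbert transform, resampled to n_out samples."""
    env = np.abs(hilbert(audio))
    return resample(env, n_out)

def visual_motion_energy(frames):
    """Mean absolute luminance change between consecutive video frames.

    frames: array of shape (n_frames, height, width)
    returns: array of shape (n_frames - 1,)
    """
    return np.abs(np.diff(frames.astype(float), axis=0)).mean(axis=(1, 2))

# Toy data: 2 s of audio at 16 kHz and 2 s of 25 fps grayscale video
rng = np.random.default_rng(2)
audio = rng.standard_normal(2 * 16000)
frames = rng.random((2 * 25, 48, 64))

aud_env = audio_envelope(audio, n_out=2 * 25)   # match the video frame rate
vis_env = visual_motion_energy(frames)
print(aud_env.shape, vis_env.shape)             # (50,) (49,)
```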