2020
DOI: 10.1101/2020.09.07.284455
Preprint
Visual speech differentially modulates beta, theta, and high gamma bands in auditory cortex

Abstract: Speech perception is a central component of social communication. While speech perception is primarily driven by sounds, accurate perception in everyday settings is also supported by meaningful information extracted from visual cues (e.g., speech content, timing, and speaker identity). Previous research has shown that visual speech modulates activity in cortical areas subserving auditory speech perception, including the superior temporal gyrus (STG), likely through feedback connections from the multisensory po…


Cited by 4 publications (3 citation statements)
References 62 publications
“…Oscillations in the beta band (16 to 30 Hz) have been shown to be closely intertwined with alpha oscillations during multisensory tasks requiring attentional inhibition (Friese et al, 2016; Ganesan et al, 2021). Beta oscillations are associated with neural networks responsible for error‐monitoring and decision making (Arnal & Giraud, 2012; Friese et al, 2016) and the maintenance of current sensorimotor information (Engel & Fries, 2010; Schneider et al, 2020).…”
Section: Introductionmentioning
confidence: 99%
“…Experimental data could compare the power spectral density of EEG waves and the firing rates to see how they correspond to the simulations on the spectra of the LFP that we have done here to further shed light on the multisensory mechanism of audiovisual processing in the brain. Further studies could also tell us how visual speech affects the different oscillatory bands spatiotemporally across the auditory cortex and compare the results with the experimental data [65]. Integrating such neural data with behavioural data on speech comprehension in a computational model will further clarify the neural mechanisms of audiovisual speech processing.…”
Section: Discussionmentioning
confidence: 95%
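The comparison proposed in the statement above — contrasting the power spectral density of recorded signals with that of simulated LFPs, including band-limited power in ranges such as the beta band (16 to 30 Hz) — can be sketched with Welch's method. This is a minimal illustration on synthetic stand-in data, not the cited study's pipeline; the sampling rate, signal construction, and band edges here are assumptions.

```python
# Hedged sketch: Welch PSD comparison between two hypothetical signals
# (a stand-in "recorded" trace and a stand-in "simulated LFP"), with
# band-limited beta power (16-30 Hz). Synthetic data, not real recordings.
import numpy as np
from scipy.signal import welch

fs = 1000.0  # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)

# Stand-ins: each signal is a 20 Hz (beta-band) oscillation plus noise.
rng = np.random.default_rng(0)
recorded = np.sin(2 * np.pi * 20 * t) + 0.5 * rng.standard_normal(t.size)
simulated = np.sin(2 * np.pi * 20 * t) + 0.5 * rng.standard_normal(t.size)

# Welch PSD: average periodograms over ~1 s segments.
f, psd_rec = welch(recorded, fs=fs, nperseg=1024)
_, psd_sim = welch(simulated, fs=fs, nperseg=1024)

# Integrate power within the beta band (16-30 Hz).
beta = (f >= 16) & (f <= 30)
df = f[1] - f[0]
beta_rec = psd_rec[beta].sum() * df
beta_sim = psd_sim[beta].sum() * df
print(f"beta power (recorded): {beta_rec:.3f}")
print(f"beta power (simulated): {beta_sim:.3f}")
```

Both spectra should peak near 20 Hz, and the ratio of the two beta-band powers gives one crude measure of spectral agreement; a real comparison would use matched recording epochs and multiple channels.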
“…Oscillations in the beta band (16 to 30 Hz) have been shown to be closely intertwined with alpha oscillations during multisensory tasks requiring attentional inhibition (Friese et al, 2016; Ganesan et al, 2021). Beta oscillations are associated with neural networks responsible for error-monitoring and decision making (Arnal & Giraud, 2012; Friese et al, 2016) and the maintenance of current sensorimotor information (Engel & Fries, 2010; Schneider et al, 2020).…”
Section: Introductionmentioning
confidence: 99%