2022
DOI: 10.1152/jn.00164.2021

Visual cortex responds to sound onset and offset during passive listening

Abstract: Sounds enhance our ability to detect, localize, and respond to co-occurring visual targets. Research suggests that sounds improve visual processing by resetting the phase of ongoing oscillations in visual cortex. However, it remains unclear what information is relayed from the auditory system to visual areas and if sounds modulate visual activity even in the absence of visual stimuli (e.g., during passive listening). Using intracranial electroencephalography (iEEG) in humans, we examined the sensitivity of vis…

Cited by 7 publications (13 citation statements)
References: 110 publications

Citation Statements (ordered by relevance)
“…Overall, our results align with recent evidence reporting that the visual cortex can contribute to auditory information processing in sighted individuals (Brang et al, 2022; Martinelli et al, 2020; Seydell-Greenwald et al, 2021; Vetter et al, 2014). Here, we observed that the visual cortex is more engaged in processing when speech signal is intelligible and clear (i.e., presented in quiet).…”
Section: Discussion (supporting)
confidence: 92%
“…The contribution of visual cortices in language processing is not limited to visual or audiovisual representations of spoken language. There is scattered evidence that the early visual cortex is also active during purely auditory stimulation (Brang et al, 2022; Petro, Paton, & Muckli, 2017; Vetter, Smith, & Muckli, 2014) and while listening to spoken language (e.g., Martinelli et al, 2020; Seydell-Greenwald, Wang, Newport, Bi, & Striem-Amit, 2021; Wolmetz, Poeppel, & Rapp, 2011). Importantly, such activations cannot be explained by semantic-based imagery alone but rather seem to reflect genuine responses to language input; in fact, the visual cortex also responds to abstract concepts with low imaginability rates (Seydell-Greenwald et al, 2021).…”
Section: Introduction (mentioning)
confidence: 99%
“…However, without concurrent physiological recording, it is difficult to generalize from “absence of perceptual modulations” to “absence of physiological entrainment.” It is at least plausible that alpha-band oscillatory activity could synchronize weakly to auditory stimulation without inducing robust modulations in behavior. The suggestive but ultimately weak and inconsistent results of Experiments 1 and 4 could be consistent with such weak synchronization, but we note that Brang et al (2022) failed to observe neural oscillations in visual cortex in response to a similar rhythmic auditory stimulus.…”
Section: Discussion (supporting)
confidence: 45%
“…There is always the possibility that our failure to observe auditory to visual entrainment is a matter of the choice of stimulus, task, or other design element unrelated to the process in question and that a more effective design could succeed in inducing or revealing visual perceptual oscillations where we have failed. However, we expect these results (the failure to induce stable oscillations in visual perception) to generalize to other types of auditory entraining stimuli and similar visual tasks, particularly given corroborating evidence of failures to induce either auditory perceptual oscillations or visual cortical oscillations in this range of frequencies using other types of entraining stimuli (Farahbod et al, 2020; and Brang et al, 2022; respectively).…”
Section: Discussion (mentioning)
confidence: 55%