2020 | Preprint
DOI: 10.1101/2020.04.15.043174

Auditory features modelling reveals sound envelope representation in striate cortex

Abstract: Primary visual cortex is no longer considered exclusively visual in its function. Evidence that its activity plays a role in multisensory processes has accrued. Here we provide evidence that, in the absence of retinal input, V1 maps sound envelope information. We modeled amplitude changes occurring at typical speech envelope timescales in four hierarchically organized categories of natural (or synthetically derived) sounds. Using functional magnetic resonance imaging, we assessed whether sound amplitude variations were rep…

Cited by 6 publications (11 citation statements). References 136 publications.
“…We took advantage of computational modeling to extract a collection of movie-related features. Specifically, two sets of low-level features were defined, one extracted from the auditory stream (spectral [26] and sound envelope [53] properties, to account for frequency- and amplitude-based modulations) and one from the visual stream (a set of static Gabor-like filters, GIST [54,55], and motion energy information based on their spatiotemporal integration [56]). Moreover, a second set of high-level features was modeled based on manual tagging of natural and artificial categories occurring in both the auditory and visual streams, as well as word embeddings from subtitles using the word2vec algorithm [57].…”
Section: Methods (mentioning)
confidence: 99%
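The sound-envelope feature described in the excerpt above lends itself to a compact illustration. Below is a minimal sketch, assuming a broadband Hilbert-transform envelope downsampled to the slow timescale of speech-like amplitude modulations; the function name, target rate, WAV input, and use of scipy are illustrative assumptions, not the citing study's actual pipeline.

    # Minimal sketch: broadband amplitude-envelope extraction of the kind
    # the excerpt describes. All names and parameters are hypothetical.
    import numpy as np
    from scipy.io import wavfile
    from scipy.signal import hilbert, resample

    def sound_envelope(path, target_hz=10):
        """Amplitude envelope of a sound, downsampled to a coarse rate
        matching slow (speech-like) amplitude modulations."""
        sr, audio = wavfile.read(path)
        if audio.ndim > 1:                    # collapse stereo to mono
            audio = audio.mean(axis=1)
        audio = audio.astype(float)
        envelope = np.abs(hilbert(audio))     # analytic-signal magnitude
        n_out = int(len(envelope) * target_hz / sr)
        return resample(envelope, n_out)      # coarse temporal resolution

The downsampled envelope can then be convolved with a hemodynamic response function and used as a regressor against voxel time series, which is the general logic of such encoding analyses.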
“…Importantly, such activations cannot be explained by semantic-based imagery alone but rather seem to reflect genuine responses to language input; in fact, the visual cortex also responds to abstract concepts with low imageability ratings (Seydell-Greenwald et al., 2021). Overall, this evidence highlights a putative role of the visual cortex in mapping temporal modulations of incoming sounds, especially in the absence of competing retinal input (Martinelli et al., 2020; Vetter et al., 2014). However, the exact role of the visual cortex in the hierarchy of speech processing remains unclear.…”
Section: Introduction (mentioning)
confidence: 92%
“…Visual ROIs were selected for the left and right hemispheres and included primary (V1; Calcarine sulcus) and secondary (V2; Lingual gyrus) visual cortex, defined as the 'S_calcarine' and 'G_oc-tem_med-Lingual' scouts in the atlas, respectively. These visual ROIs were selected based on recently reported evidence of their involvement in speech processing not only in blind but also in sighted individuals, albeit to a lower extent (Martinelli et al., 2020; Petro et al., 2017; Seydell-Greenwald et al., 2021; Van Ackeren, Barbero, Mattioni, Bottini, & Collignon, 2018; Vetter et al., 2014, 2020). Once the ROIs were created, their time series were extracted and submitted to the analysis.…”
Section: Source Estimation (mentioning)
confidence: 99%
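The ROI time-series step in this excerpt reduces to averaging source time courses across the vertices of an atlas scout. Below is a minimal sketch, assuming source estimates as a (vertices × time) NumPy array and a per-vertex label array; the scout name follows the excerpt, while everything else is a hypothetical stand-in for the actual source-estimation workflow.

    # Minimal sketch: extract one ROI time series by averaging the source
    # time courses of all vertices in an atlas scout. Array layout and
    # label handling are assumptions, not the study's actual code.
    import numpy as np

    def roi_timeseries(source_data, labels, scout):
        """source_data: (n_vertices, n_times); labels: (n_vertices,)
        array of scout names; returns the ROI-average time course."""
        mask = labels == scout
        return source_data[mask].mean(axis=0)

    # e.g. v1_ts = roi_timeseries(stc_data, vertex_labels, "S_calcarine")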