Are visual and auditory stimuli processed by similar mechanisms in the human cerebral cortex? Images can be thought of as light-energy modulations over two spatial dimensions, and low-level visual areas analyze images by decomposing them into spatial frequencies. Similarly, sounds are energy modulations over time and frequency, and they can be identified and discriminated by the content of such modulations. An obvious question is therefore whether human auditory areas, in direct analogy to visual areas, represent the spectro-temporal modulation content of acoustic stimuli. To answer this question, we measured spectro-temporal modulation transfer functions of single voxels in the human auditory cortex with functional magnetic resonance imaging. We presented dynamic ripples, complex broadband stimuli with a drifting sinusoidal spectral envelope. Dynamic ripples are the auditory equivalent of the gratings often used in studies of the visual system. We demonstrate selective tuning to combined spectro-temporal modulations in the primary and secondary auditory cortex. We describe several types of modulation transfer functions, extracting different spectro-temporal features, with a high degree of interaction between spectral and temporal parameters. The overall low-pass modulation rate preference of the cortex matches the modulation content of natural sounds. These results demonstrate that combined spectro-temporal modulations are represented in the human auditory cortex, and suggest that complex signals are decomposed and processed according to their modulation content, the same transformation used by the visual system.

Keywords: auditory system | cortical representation | dynamic ripples

A major goal of neuroscience is to discover the mechanisms by which sensory information is encoded and recoded in the brain. Many of these mechanisms were first described in the visual system. It is unclear to what extent auditory and visual processing are based on common principles, but a variety of intriguing similarities suggest that such principles exist: conservation of the topography of the sensory epithelium (retinotopy, tonotopy) in low-level structures, layout of higher structures in broad cortical processing streams (1, 2), common principles of perceptual organization such as grouping and illusory continuity (3), similar motion aftereffects (4, 5), and similar organization of memory (6). An established model of visual object recognition is a processing hierarchy in which successive stages are selective for increasingly complex features by combining the outputs of simpler feature detectors, starting with patchy spatial modulation-frequency filters (7). In auditory neuroscience there is no consensus yet about the suitable set of low-level features. If the analogy to the visual system holds, then spectro-temporal modulation-rate detectors are likely to be included in this set. Selectivity to other types of modulation has been studied in some detail in animal models and humans, especially frequency and amplitude modulations (8-15). Selectivity for...
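For a concrete picture of the stimulus, the sketch below synthesizes a dynamic ripple as a sum of log-spaced tones whose amplitudes follow a sinusoidal spectral envelope that drifts over time. It is a minimal illustration only: the function name, parameter names, and default values (tone density, ripple rate in Hz, ripple density in cycles per octave, modulation depth) are assumptions for this example, not the values used in the study.

```python
import numpy as np

def dynamic_ripple(duration=1.0, fs=44100, f0=250.0, n_octaves=5,
                   n_tones=200, rate_hz=4.0, density_cyc_per_oct=1.0,
                   depth=0.9, seed=0):
    """Sketch of a dynamic-ripple stimulus: many tones spaced evenly on a
    log-frequency (octave) axis, each amplitude-modulated by a sinusoidal
    spectro-temporal envelope drifting at rate_hz and repeating every
    1/density_cyc_per_oct octaves along the spectrum."""
    rng = np.random.default_rng(seed)
    t = np.arange(int(duration * fs)) / fs
    x = np.linspace(0.0, n_octaves, n_tones)      # tone positions in octaves above f0
    freqs = f0 * 2.0 ** x
    phases = rng.uniform(0, 2 * np.pi, n_tones)   # random starting phases
    signal = np.zeros_like(t)
    for xi, fi, phi in zip(x, freqs, phases):
        # Envelope drifts in time (rate_hz) and ripples across log frequency.
        env = 1.0 + depth * np.sin(2 * np.pi * (rate_hz * t + density_cyc_per_oct * xi))
        signal += env * np.sin(2 * np.pi * fi * t + phi)
    return signal / np.max(np.abs(signal))        # normalize to +/- 1
```

Varying rate_hz and density_cyc_per_oct independently sweeps the temporal and spectral modulation axes, which is what allows a modulation transfer function to be estimated per voxel.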
The present study investigates the acoustic basis of the hemispheric asymmetry for the processing of speech and music. Experiments on this question ideally involve stimuli that are perceptually unrelated to speech and music but contain acoustic characteristics of both. Stimuli in previous studies were derived from speech samples or tonal sequences. Here we introduce a new class of noise-like sound stimuli, with no resemblance to speech or music, that permit independent parametric variation of spectral and temporal acoustic complexity. Using these stimuli in a functional MRI experiment, we test the hypothesis of a hemispheric asymmetry for the processing of spectral and temporal sound structure by seeking cortical areas in which the blood oxygen level-dependent (BOLD) signal covaries with the number of simultaneous spectral components (spectral complexity) or the temporal modulation rate (temporal complexity) of the stimuli. BOLD responses from the left and right Heschl's gyrus (HG) and part of the right superior temporal gyrus covaried with the spectral parameter, whereas the covariation analysis for the temporal parameter highlighted an area on the left superior temporal gyrus. The portion of the superior temporal gyrus in which asymmetrical responses are apparent corresponds to the antero-lateral auditory belt cortex, which has been implicated in spectral integration in animal studies. Our results support a similar function of the anterior auditory belt in humans. The findings indicate that asymmetrical processing of complex sounds in the cerebral hemispheres depends not on semantic but on acoustic stimulus characteristics.
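The sketch below illustrates one plausible way to build such noise-like stimuli with independently controlled spectral and temporal complexity. It is an assumption-laden reconstruction for illustration, not the published stimulus code: the band-noise synthesis, parameter names, and defaults are all hypothetical.

```python
import numpy as np

def band_noise(n, fs, fc, bw, rng):
    """Gaussian noise band-pass filtered in the frequency domain around fc."""
    spec = np.fft.rfft(rng.standard_normal(n))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    spec[(freqs < fc - bw / 2) | (freqs > fc + bw / 2)] = 0.0
    return np.fft.irfft(spec, n)

def noise_like_stimulus(duration=1.0, fs=44100, n_components=4, change_rate_hz=2.0,
                        bw=100.0, f_lo=300.0, f_hi=4000.0, seed=0):
    """Illustrative sketch: spectral complexity is set by n_components, the
    number of simultaneous noise bands; temporal complexity by change_rate_hz,
    how often the band centre frequencies jump to new random values."""
    rng = np.random.default_rng(seed)
    n_total = int(duration * fs)
    seg_len = max(1, int(fs / change_rate_hz))   # samples per temporal segment
    out = np.zeros(n_total)
    for start in range(0, n_total, seg_len):
        stop = min(start + seg_len, n_total)
        centres = rng.uniform(f_lo, f_hi, n_components)  # new spectrum for this segment
        out[start:stop] = sum(band_noise(stop - start, fs, fc, bw, rng) for fc in centres)
    return out / np.max(np.abs(out))
```

Because n_components and change_rate_hz are set independently, BOLD covariation with each parameter can be tested separately, which is the logic of the parametric design described above.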
A part of the auditory system automatically detects changes in the acoustic environment. This preattentive process has been studied extensively, yet its cerebral origins have not been determined with sufficient accuracy to allow comparison to established anatomical and functional parcellations. Here we used event-related functional MRI and EEG in a parametric experimental design to determine the cortical areas in individual brains that participate in the detection of acoustic changes. Our results suggest that automatic change processing consists of at least three stages: initial detection in the primary auditory cortex, detailed analysis in the posterior superior temporal gyrus and planum temporale, and judgment of sufficient novelty for the allocation of attentional resources in the mid-ventrolateral prefrontal cortex.