The use of functional magnetic resonance imaging (fMRI) to explore central auditory function may be compromised by the intense bursts of stray acoustic noise produced by the scanner whenever the magnetic resonance signal is read out. We present results evaluating the use of one method to reduce the effect of the scanner noise: "sparse" temporal sampling. Using this technique, single volumes of brain images are acquired at the end of stimulus and baseline conditions. To optimize detection of the activation, images are taken near the maxima and minima of the hemodynamic response during the experimental cycle. Thus, the effective auditory stimulus for the activation is not masked by the scanner noise. In experiment 1, the course of the hemodynamic response to auditory stimulation was mapped during continuous task performance. The mean peak of the response was at 10.5 sec after stimulus onset, with little further change until stimulus offset. In experiment 2, sparse imaging was used to acquire activation images. Despite the smaller number of samples, sparse imaging successfully delimited broadly the same regions of activation as conventional continuous imaging. However, the mean percentage MR signal change within the region of interest was greater using sparse imaging. Auditory experiments that use continuous imaging methods may measure activation that is a result of an interaction between the stimulus and task factors (e.g., attentive effort) induced by the intense background noise. We suggest that sparse imaging is advantageous in auditory experiments as it ensures that the obtained activation depends on the stimulus alone.
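The sparse sampling scheme described above can be sketched as a minimal timing calculation. Only the roughly 10.5 s hemodynamic peak latency comes from the abstract (experiment 1); the epoch durations, function name, and condition labels are illustrative assumptions, not the authors' protocol.

```python
# Minimal sketch of a sparse temporal sampling schedule. Illustrative
# parameters: only the ~10.5 s hemodynamic peak latency is from the text.

def sparse_acquisition_times(n_cycles, stim_dur=10.5, base_dur=10.5):
    """Return (condition, time) pairs placing each single-volume
    acquisition at the end of a stimulus or baseline epoch, i.e. near
    the expected maximum or minimum of the hemodynamic response."""
    schedule = []
    t = 0.0
    for _ in range(n_cycles):
        t += stim_dur                     # stimulus epoch plays in silence
        schedule.append(("stimulus", t))  # acquire near the response peak
        t += base_dur                     # silent baseline epoch
        schedule.append(("baseline", t))  # acquire near the response minimum
    return schedule

schedule = sparse_acquisition_times(2)
# [('stimulus', 10.5), ('baseline', 21.0), ('stimulus', 31.5), ('baseline', 42.0)]
```

Because each volume is acquired only after an epoch ends, the scanner's readout noise never overlaps the auditory stimulus whose response is being measured.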
We report a systematic relationship between sound-frequency tuning and sensitivity to interaural time delays for neurons in the midbrain nucleus of the inferior colliculus; neurons with relatively low best frequencies (BFs) showed response peaks at long delays, whereas neurons with relatively high BFs showed response peaks at short delays. The consequence of this relationship is that the steepest region of the function relating discharge rate to interaural time delay (ITD) fell close to the midline for all neurons irrespective of BF. These data support a scheme for processing the output of coincidence detectors subserving low-frequency sound localization in which the location of a sound source is determined by the activity in two broad, hemispheric spatial channels, rather than by numerous channels tuned to discrete spatial positions.
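The two-channel account in this abstract can be illustrated with a toy readout model. The sigmoidal rate functions and their parameters below are hypothetical; they are chosen only so that each hemispheric channel's steepest slope falls near the midline (ITD = 0), as the abstract reports, with laterality decoded from the difference in activity between the two channels.

```python
import math

# Toy two-channel model of ITD coding (hypothetical parameters): each
# hemisphere is one broad channel whose rate-vs-ITD function is steepest
# near the midline; laterality is read out from the rate difference.

def channel_rate(itd_us, preferred_sign, slope=0.01):
    """Sigmoidal discharge rate (0..1) of one hemispheric channel.
    preferred_sign is +1 for the channel favouring one side, -1 the other;
    itd_us is the interaural time delay in microseconds."""
    return 1.0 / (1.0 + math.exp(-preferred_sign * slope * itd_us))

def decode_laterality(itd_us):
    """Signed laterality estimate: difference between the two channel rates.
    Zero at the midline, monotonically increasing toward either side."""
    return channel_rate(itd_us, +1) - channel_rate(itd_us, -1)
```

The readout is most sensitive where the channel functions are steepest, i.e. near ITD = 0, which is where natural ITDs are smallest and localization acuity is highest.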
We have investigated the responses of neurones in the guinea-pig superior colliculus to combinations of visual and auditory stimuli. When these stimuli were presented separately, some neurones responded only to one modality, others to both, and a few responded reliably to neither. Under bimodal stimulation, many neurones exhibited some form of cross-modality interaction, the degree and nature of which depended on the relative timing and location of the two stimuli. Facilitatory and inhibitory interactions were observed and, occasionally, both effects were found in the same neurone at different inter-stimulus intervals. Neurones whose responses to visual stimuli were enhanced by an auditory stimulus were found in the superficial layers. Although visual-enhanced and visual-depressed auditory neurones were found throughout the deep layers, the majority of them were recorded in the stratum griseum profundum. Neurones that responded to both visual and auditory stimuli presented separately and gave enhanced or depressed responses to bimodal stimulation were found throughout the deep layers, but were concentrated in the stratum griseum intermediale and extended into the stratum opticum.
The organisation of guinea pig auditory cortex was studied by combining histological methods with microelectrode mapping. This allowed the location of seven auditory areas to be determined in relation to the visual and primary somatosensory areas. The auditory areas were identified by single-unit recordings and their borders defined by evoked potential mapping. The visual areas were identified by their relatively high densities of myelinated fibres, while the primary somatosensory cortex was identified by its characteristic barrels of high cytochrome oxidase (CYO) activity in layer IV. The auditory region had moderate levels of CYO and myelin staining. When staining was optimal, there was a clear edge to the moderate CYO activity, which apparently corresponds to the dorsal border of the primary auditory area (AI) and the other core field that lies dorsocaudal to it (DC). Thus the primary somatosensory area and the visual and auditory regions were separated from each other by a region with lower levels of CYO and myelin staining. The ventral borders of AI and DC could not be determined histologically as there were no sharp transitions in the levels of CYO or myelin staining. The two core areas were partially surrounded by belt areas. The dorsorostral belt and most of the belt around DC responded more strongly to broad-band stimuli than pure tones, while the ventrorostral belt, the small field (S) and a belt zone ventral to the rostral part of DC responded better to pure tones. Units in S typically had higher thresholds and broader tuning to pure tones than those in AI, while units in the ventrorostral belt typically had longer onset latencies and gave more sustained responses than units in AI.