The phenomenon of binaural interference, where binaural judgments of a high-frequency target stimulus are disrupted by the presence of a simultaneous low-frequency interferer, can largely be explained using principles of auditory grouping and segregation. Evidence for this relationship comes from a number of previous studies showing that the manipulation of simultaneous grouping cues such as harmonicity and onset synchrony can influence the strength of the phenomenon. In this study, it is shown that sequential grouping cues can also influence whether binaural interference occurs. Subjects indicated the lateral position of a high-frequency sinusoidally amplitude-modulated (SAM) tone containing an interaural time difference. Perceived lateral positions were reduced by the presence of a simultaneous diotic low-frequency SAM tone, but were largely restored when the interferer was "captured" in a stream of identical tones. A control condition confirmed that the effect was not due to peripheral adaptation. The data lend further support to the idea that binaural interference is affected by processes related to the perceptual organization of auditory information. Modifications to existing grouping-based models are proposed that may help account for binaural interference effects more successfully.
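As a rough illustration of the stimulus configuration described above, the sketch below builds a high-frequency SAM target carrying an interaural time difference together with a diotic low-frequency SAM interferer. All parameter values (carrier and modulation frequencies, ITD, sample rate, duration) are assumptions chosen for illustration and are not taken from the study; studies of this kind often impose the ITD on the envelope only, whereas this sketch simply delays the whole waveform at one ear.

```python
import numpy as np

def sam_tone(fc, fm, dur, fs, depth=1.0):
    """Sinusoidally amplitude-modulated (SAM) tone: carrier fc, modulator fm (Hz)."""
    t = np.arange(int(dur * fs)) / fs
    return (1.0 + depth * np.sin(2 * np.pi * fm * t)) * np.sin(2 * np.pi * fc * t)

fs = 48000      # sample rate (Hz) -- assumed
dur = 0.4       # stimulus duration (s) -- assumed
itd = 500e-6    # interaural time difference of the target (s) -- assumed

# High-frequency SAM target; the ITD is imposed by delaying one ear's copy.
target = sam_tone(fc=4000.0, fm=100.0, dur=dur, fs=fs)
delay = int(round(itd * fs))
target_left = target
target_right = np.concatenate([np.zeros(delay), target[:len(target) - delay]])

# Low-frequency SAM interferer, diotic (identical waveform at the two ears).
interferer = sam_tone(fc=500.0, fm=100.0, dur=dur, fs=fs)

left = target_left + interferer
right = target_right + interferer
stereo = np.stack([left, right], axis=1)  # two-channel signal for headphone presentation
```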
There have been conflicting reports concerning the importance of visual experience in the development of auditory localization mechanisms. We have examined the representation of auditory space in the superior colliculus of adult ferrets that were visually deprived by binocular eyelid suture from postnatal days 25-28, prior to natural eye opening, until the time of recording. This procedure attenuated the transmission of light by a factor of at least 20-25 and blurred the image so that, as long as the eyelids were still fused, the responses of visual units in the superficial layers of the superior colliculus were labile and very poorly tuned. After the eyelids were opened, the representation of the visual field in these layers appeared to be normal. Acoustically responsive units were, as usual, almost exclusively restricted to the deeper layers of the superior colliculus. However, unlike normal animals, where responses occurring only at stimulus onset predominate, most of these units exhibited sustained or multi-peaked discharge patterns. The degree of spatial tuning of individual units recorded from the normal and deprived groups of animals was not significantly different in either azimuth or elevation. Normally orientated maps of both sound azimuth and elevation were also found in the visually deprived ferrets. However, abnormalities were present in the topography and precision of these representations and consequently in their alignment with the overlying visual map. In particular, an increase was observed in the proportion of auditory units with spatially ambiguous receptive fields, in which the maximum response occurred at two distinct locations. These results indicate that patterned visual experience is not required for establishing at least a crude map of auditory space in the superior colliculus, but suggest that it may play a role in refining this representation during development.
Because of the slow speed of sound relative to light, acoustic and visual signals from a distant event often will be received asynchronously. Here, using acoustic signals with a robust cue to sound source distance, we show that judgments of perceived temporal alignment with a visual marker depend on the depth simulated in the acoustic signal. For distant sounds, a large delay of sound relative to vision is required for the signals to be perceived as temporally aligned. For nearer sources, the time lag corresponding to audiovisual alignment is smaller and scales at a rate approximating the speed of sound. Thus, when robust cues to auditory distance are present, the brain can synchronize disparate audiovisual signals to external events despite considerable differences in time of arrival at the perceiver. This ability is functionally important as it allows auditory and visual signals to be synchronized to the external event that caused them.

Keywords: audiovisual interactions | auditory distance perception | auditory psychophysics

Studies of audiovisual temporal alignment generally have found that an auditory stimulus needs to be delayed to be perceptually aligned with a visual stimulus (1-7). This temporal offset, on the order of several tens of milliseconds, is thought to reflect the slower processing times for visual stimuli. This offset arises because acoustic transduction between the outer and inner ears is a direct mechanical process and is extremely fast, taking just 1 ms or less (8, 9), whereas, by contrast, phototransduction in the retina is a relatively slow photochemical process followed by several cascading neurochemical stages and lasts ≈50 ms (10-14). Thus, differential latencies between auditory and visual processing agree well with the common finding that auditory signals must lag visual signals by ≈40-50 ms if they are to be perceived as temporally aligned.

Most studies of audiovisual alignment, however, are based on experiments done in the near field, meaning that auditory travel time is a negligible factor. Studies of audiovisual alignment conducted over greater distances have examined whether the brain can compensate for the slow travel time of sound, but these have produced contradictory results (15-17). Here, we test whether subjective audiovisual alignment reflects only the relatively stable internal latency differences that are well documented or whether knowledge of the external distance of an auditory source can be used to compensate for the slow travel time of sound relative to light. In the experiments presented here, we use a powerful cue to auditory source distance, the ratio of direct-to-reverberant energy (18), to vary the apparent distance of the acoustic signal's origin. By manipulating this cue, we find that increases in sound source distance cause sounds to be perceptually aligned with earlier visual events. When large auditory distances are simulated, the auditory lag required for vision to be aligned with sound becomes exaggerated far beyond what can be accounted for by differential neural laten...
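The argument above reduces to simple arithmetic: acoustic travel time grows with source distance at roughly 1 ms per 34 cm, and this sits on top of the roughly 40-50 ms auditory lag usually needed to offset slower visual processing. The sketch below computes a naive prediction of the point of subjective alignment if the brain fully compensated for travel time; the 343 m/s speed of sound and the 45 ms internal offset are illustrative values under stated assumptions, not figures reported in the study.

```python
SPEED_OF_SOUND = 343.0     # m/s in air -- approximate, assumed
INTERNAL_OFFSET_MS = 45.0  # near-field auditory lag (ms) offsetting slower visual processing -- illustrative

def predicted_alignment_lag_ms(distance_m: float) -> float:
    """Naive prediction of the sound-after-light delay (ms) perceived as aligned,
    assuming full compensation for acoustic travel time from a source at distance_m."""
    travel_time_ms = distance_m / SPEED_OF_SOUND * 1000.0
    return INTERNAL_OFFSET_MS + travel_time_ms

for d in (1.0, 5.0, 10.0, 20.0, 40.0):
    print(f"{d:5.1f} m -> aligned at ~{predicted_alignment_lag_ms(d):5.1f} ms audio delay")
```

Under these assumptions a source at 40 m would be perceptually aligned only when the sound lags the light by well over 100 ms, which is the kind of distance-dependent scaling the abstract describes.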
Changing the shape of the outer ear using small in-ear molds degrades sound localization performance consistent with the distortion of monaural spectral cues to location. It has been shown recently that adult listeners re-calibrate to these new spectral cues for locations both inside and outside the visual field. This raises the question of what teacher signal drives this remarkable functional plasticity. Furthermore, large individual differences in the extent and rate of accommodation suggest that a number of factors may be driving this process. A training paradigm exploiting multi-modal and sensory-motor feedback during accommodation was examined to determine whether it might accelerate this process. To standardize the modification of the spectral cues, molds filling 40% of the volume of each outer ear were custom made for each subject. Daily training sessions of about an hour, involving repetitive auditory stimuli and exploratory behavior by the subject, significantly improved the extent of accommodation, as measured by both front-back confusions and polar angle localization errors, with some improvement in the rate of accommodation demonstrated by front-back confusion errors. This work has implications both for the process by which a coherent representation of auditory space is maintained and for accommodative training for hearing aid wearers.
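To make the two accommodation measures concrete, the sketch below computes a front-back confusion rate and a mean polar-angle error from paired target/response polar angles. The coordinate convention (0° front, 90° above, 180° back, with the front hemifield taken as polar angle below 90° or above 270°), the helper names, and the example data are assumptions for illustration, not the study's analysis.

```python
import numpy as np

def front_back_confusion_rate(target_polar_deg, response_polar_deg):
    """Fraction of trials where the response falls in the opposite front/back
    hemifield from the target (assumed convention: front = polar < 90 or > 270)."""
    t = np.asarray(target_polar_deg)
    r = np.asarray(response_polar_deg)
    target_front = (t < 90) | (t > 270)
    response_front = (r < 90) | (r > 270)
    return np.mean(target_front != response_front)

def mean_polar_error(target_polar_deg, response_polar_deg):
    """Mean absolute polar-angle error, wrapping differences into [0, 180] degrees."""
    diff = np.abs(np.asarray(response_polar_deg) - np.asarray(target_polar_deg)) % 360
    return np.mean(np.minimum(diff, 360 - diff))

# Hypothetical example data (degrees), not from the study:
targets   = [10, 45, 135, 170, 300]
responses = [15, 160, 120, 30, 310]
print(front_back_confusion_rate(targets, responses))  # 0.4
print(mean_polar_error(targets, responses))           # 57.0
```

In practice, analyses of this kind often resolve or exclude front-back confused trials before computing the polar-angle error; the sketch keeps the two measures independent for simplicity.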