We examined the frame of reference of auditory responses in the inferior colliculus (IC) of monkeys fixating visual stimuli at different locations. Eye position modulated the level of auditory responses in 33% of the neurons we encountered but did not appear to shift their spatial tuning. The effect of eye position on auditory responses was substantial, comparable in magnitude to that of sound location. The eye position signal appeared to interact with the auditory responses in an at least partly multiplicative fashion. We conclude that the representation of sound location in the primate IC is distributed and that its frame of reference is intermediate between head- and eye-centered coordinates. The information contained in these neurons appears sufficient for later neural stages to calculate the positions of sounds with respect to the eyes.
Taken together with emerging results in both visual and other auditory areas, these findings suggest that neurons whose responses reflect complex interactions between stimulus position and eye position set the stage for the eventual convergence of auditory and visual information.
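A common way to formalize the kind of interaction described above is a multiplicative gain-field model, in which a neuron's head-centered spatial tuning is scaled by a gain term that depends on eye position. The sketch below illustrates that idea only; the function names, tuning shapes, and parameter values are hypothetical and are not taken from the study.

```python
import numpy as np

# Illustrative gain-field model: the response to a sound depends on
# head-centered azimuth, scaled multiplicatively by eye position.
# All tuning parameters below are hypothetical, chosen for illustration.

def head_centered_tuning(sound_azimuth_deg):
    """Monotonic (sigmoidal) tuning favoring contralateral azimuths."""
    return 1.0 / (1.0 + np.exp(-sound_azimuth_deg / 20.0))

def eye_position_gain(eye_azimuth_deg, slope=0.02):
    """Linear gain term modulated by horizontal eye position."""
    return 1.0 + slope * eye_azimuth_deg

def firing_rate(sound_azimuth_deg, eye_azimuth_deg, peak_rate=50.0):
    """Multiplicative interaction: rate = f(sound) * g(eye)."""
    return (peak_rate
            * head_centered_tuning(sound_azimuth_deg)
            * eye_position_gain(eye_azimuth_deg))

# Same sounds under three fixation positions: the response level
# changes with eye position, but the shape of the tuning does not.
for eye in (-12.0, 0.0, 12.0):
    rates = [firing_rate(s, eye) for s in (-60, 0, 60)]
    print(f"eye={eye:+.0f} deg -> rates at -60/0/+60 deg:",
          [f"{r:.1f}" for r in rates])
```

In this toy model the ratio of responses across sound locations is unchanged by eye position, which is one signature of a purely multiplicative interaction; a partly multiplicative interaction, as reported above, would also include an additive eye-position component.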
We investigated the format of the code for sound location in the inferior colliculi of three awake monkeys (Macaca mulatta). We found that roughly half of our sample of 99 neurons was sensitive to the free-field locations of broadband noise presented in the frontal hemisphere. Such neurons nearly always responded monotonically as a function of sound azimuth, with stronger responses for more contralateral sound locations. Few, if any, neurons had circumscribed receptive fields. Spatial sensitivity was broad: the proportion of the total sample of neurons responding to a sound at a given location ranged from 30% for ipsilateral locations to 80% for contralateral locations. These findings suggest that sound azimuth is represented via a population rate code of very broadly responsive neurons in primate inferior colliculi. This representation differs in format from the place code used for encoding the locations of visual and tactile stimuli and poses problems for the eventual convergence of auditory and visual or somatosensory signals. Accordingly, models for converting this representation into a place code are discussed.
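The rate-to-place conversion mentioned at the end of this abstract can be sketched with a simple simulation: a population of broadly tuned, monotonic neurons yields a summed rate that rises with contralateral azimuth, and inverting that monotonic function recovers a specific location. This is a minimal illustration of one such model, not the study's analysis; the population parameters and the readout are assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population: broadly tuned neurons whose rates rise
# monotonically toward contralateral azimuths (sigmoids with scattered
# midpoints, slopes, and peak rates). Values are illustrative only.
n_neurons = 99
midpoints = rng.uniform(-40.0, 40.0, n_neurons)   # degrees azimuth
slopes = rng.uniform(0.03, 0.10, n_neurons)       # steepness per degree
peaks = rng.uniform(20.0, 60.0, n_neurons)        # spikes/s

def population_rates(azimuth_deg):
    """Monotonic rate code: each neuron's rate grows with azimuth."""
    return peaks / (1.0 + np.exp(-slopes * (azimuth_deg - midpoints)))

# Rate-to-place readout: the summed population rate is monotonic in
# azimuth, so azimuth can be decoded by inverting that function.
azimuths = np.linspace(-90.0, 90.0, 181)
summed = np.array([population_rates(a).sum() for a in azimuths])

def decode_azimuth(observed_rates):
    """Invert the summed-rate curve by linear interpolation."""
    return np.interp(observed_rates.sum(), summed, azimuths)

for target in (-60.0, 0.0, 45.0):
    est = decode_azimuth(population_rates(target))
    print(f"true {target:+.0f} deg -> decoded {est:+.1f} deg")
```

A place-code target structure such as the superior colliculus would need an additional stage that converts this scalar estimate into a localized peak of activity on a motor map; the interpolation step above stands in for that transformation.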
Auditory spatial information arises in a head-centered coordinate frame, whereas the saccade command signals generated by the superior colliculus (SC) are thought to specify target locations in an eye-centered frame. However, auditory activity in the SC appears to be neither head- nor eye-centered but to occupy a reference frame intermediate between the two. This neurophysiological finding suggests that auditory saccades might not fully compensate for changes in initial eye position. Here, we investigated whether the accuracy of saccades to sounds is affected by initial eye position in rhesus monkeys. We found that, on average, a 12-degree horizontal shift in initial eye position produced only a 0.6- to 1.6-degree horizontal shift in the endpoints of auditory saccades made to targets at a range of locations along the horizontal meridian. This shift was similar in size to the modest influence of eye position on visual saccades. Such virtually complete compensation for initial eye position implies that auditory activity in the SC is read out in a manner appropriate for generating accurate saccades to sounds.
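The compensation being tested here reduces to a coordinate transform: a head-centered sound location must be converted to an eye-centered motor vector by subtracting current eye position. The short sketch below works through that arithmetic and the compensation percentages implied by the endpoint shifts reported above; the functions and numbers are an illustrative calculation, not the study's analysis code.

```python
# Coordinate transform implied by accurate auditory saccades:
# eye-centered saccade vector = head-centered target - eye position.

def saccade_vector(target_head_deg, eye_position_deg):
    """Eye-centered saccade command for a head-centered target."""
    return target_head_deg - eye_position_deg

def endpoint(initial_eye_deg, target_head_deg):
    """Final eye position (head-centered) after the saccade."""
    return initial_eye_deg + saccade_vector(target_head_deg, initial_eye_deg)

# Perfect compensation: a 12-degree change in initial eye position
# leaves the endpoint unchanged (0-degree endpoint shift).
shift = endpoint(12.0, 20.0) - endpoint(0.0, 20.0)
print(f"endpoint shift with full compensation: {shift:.1f} deg")

# Observed endpoint shifts of 0.6 to 1.6 degrees per 12-degree eye
# shift correspond to compensating roughly 87-95% of the change.
for endpoint_shift in (0.6, 1.6):
    print(f"compensation: {100 * (1 - endpoint_shift / 12.0):.0f}%")
```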