The inferior colliculus (IC) is a major processing center situated mid-way along both the ascending and descending auditory pathways of the brain stem. Although it is fundamentally an auditory area, the IC also receives anatomical input from non-auditory sources. Neurophysiological studies corroborate that non-auditory stimuli can modulate auditory processing in the IC and even elicit responses independent of coincident auditory stimulation. In this article, we review anatomical and physiological evidence for multisensory and other non-auditory processing in the IC. Specifically, the contributions of signals related to vision, eye movements and position, somatosensation, and behavioral context to neural activity in the IC will be described. These signals are potentially important for localizing sound sources, attending to salient stimuli, distinguishing environmental from self-generated sounds, and perceiving and generating communication sounds. They suggest that the IC should be thought of as a node in a highly interconnected sensory, motor, and cognitive network dedicated to synthesizing a higher-order auditory percept rather than simply reporting patterns of air pressure detected by the cochlea. We highlight some of the potential pitfalls that can arise from experimental manipulations that may disrupt the normal function of this network, such as the use of anesthesia or the severing of connections from cortical structures that project to the IC. Finally, we note that the presence of these signals in the IC has implications for our understanding not just of the IC but also of the multitude of other regions within and beyond the auditory system that are dependent on signals that pass through the IC. Whatever the IC “hears” would seem to be passed both “upward” to thalamus and thence to auditory cortex and beyond, as well as “downward” via centrifugal connections to earlier areas of the auditory pathway such as the cochlear nucleus.
Interactions between sensory pathways such as the visual and auditory systems are known to occur in the brain, but where they first occur is uncertain. Here, we show a multimodal interaction evident at the eardrum. Ear canal microphone measurements in humans (n = 19 ears in 16 subjects) and monkeys (n = 5 ears in three subjects) performing a saccadic eye movement task to visual targets indicated that the eardrum moves in conjunction with the eye movement. The eardrum motion was oscillatory and began as early as 10 ms before saccade onset in humans or with saccade onset in monkeys. These eardrum movements, which we dub eye movement-related eardrum oscillations (EMREOs), occurred in the absence of a sound stimulus. The amplitude and phase of the EMREOs depended on the direction and horizontal amplitude of the saccade. They lasted throughout the saccade and well into subsequent periods of steady fixation. We discuss the possibility that the mechanisms underlying EMREOs create eye movement-related binaural cues that may aid the brain in evaluating the relationship between visual and auditory stimulus locations as the eyes move.

reference frame | otoacoustic emissions | middle ear muscles | saccade | EMREO

Visual information can aid hearing, such as when lip reading cues facilitate speech comprehension. To derive such benefits, the brain must first link visual and auditory signals that arise from common locations in space. In species with mobile eyes (e.g., humans, monkeys), visual and auditory spatial cues bear no fixed relationship to one another but change dramatically and frequently as the eyes move, about three times per second over an 80° range of space. Accordingly, considerable effort has been devoted to determining where and how the brain incorporates information about eye movements into the visual and auditory processing streams (1). In the primate brain, all of the regions previously evaluated have shown some evidence that eye movements modulate auditory processing [inferior colliculus (2-6), auditory cortex (7-9), parietal cortex (10-12), and superior colliculus (13-17)]. Such findings raise the question of where in the auditory pathway eye movements first impact auditory processing. In this study, we tested whether eye movements affect processing in the auditory periphery.

The auditory periphery possesses at least two means of tailoring its processing in response to descending neural control (Fig. 1). First, the middle ear muscles (MEMs), the stapedius and tensor tympani, attach to the ossicles that connect the eardrum to the oval window of the cochlea. Contraction of these muscles tugs on the ossicular chain, modulating middle ear sound transmission and moving the eardrum. Second, within the cochlea, the outer hair cells (OHCs) are mechanically active and modify the motion of both the basilar membrane and, through mechanical coupling via the ossicles, the eardrum [i.e., otoacoustic emissions (OAEs)]. In short, the actions of the MEMs and OHCs affect not only the response to incoming sound but also transmi...
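To make the measurement described above concrete, the following is a minimal sketch of the kind of saccade-triggered averaging such an experiment implies: ear-canal microphone traces are aligned on saccade onset and averaged separately by saccade direction, so that eye movement-related oscillations stand out from noise. All names and the data layout (mic_signal, saccade_onsets, saccade_dirs, fs) are hypothetical illustrations, not the authors' actual analysis code.

import numpy as np

def saccade_triggered_average(mic_signal, saccade_onsets, saccade_dirs,
                              fs, pre_ms=100, post_ms=300):
    """Average microphone traces aligned on saccade onset, split by direction.

    mic_signal     : 1-D array of ear-canal microphone samples (hypothetical)
    saccade_onsets : saccade onset times, in samples
    saccade_dirs   : +1 for rightward, -1 for leftward saccades
    fs             : sampling rate in Hz
    """
    pre = int(pre_ms * fs / 1000)    # samples kept before saccade onset
    post = int(post_ms * fs / 1000)  # samples kept after saccade onset
    averages = {}
    for direction in (-1, +1):
        # Collect epochs of this direction that fit within the recording.
        epochs = [
            mic_signal[t - pre:t + post]
            for t, d in zip(saccade_onsets, saccade_dirs)
            if d == direction and t - pre >= 0 and t + post <= len(mic_signal)
        ]
        # Baseline-correct each epoch to its pre-saccade mean so slow drift
        # does not masquerade as eye movement-related eardrum motion.
        epochs = [e - e[:pre].mean() for e in epochs]
        averages[direction] = np.mean(epochs, axis=0)
    return averages

With real recordings, the two direction-averaged traces would be expected to be roughly opposite in phase, consistent with the reported dependence of EMREO amplitude and phase on saccade direction.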