Interactions between sensory pathways such as the visual and auditory systems are known to occur in the brain, but where they first occur is uncertain. Here, we show a multimodal interaction evident at the eardrum. Ear canal microphone measurements in humans (n = 19 ears in 16 subjects) and monkeys (n = 5 ears in three subjects) performing a saccadic eye movement task to visual targets indicated that the eardrum moves in conjunction with the eye movement. The eardrum motion was oscillatory and began as early as 10 ms before saccade onset in humans or with saccade onset in monkeys. These eardrum movements, which we dub eye movement-related eardrum oscillations (EMREOs), occurred in the absence of a sound stimulus. The amplitude and phase of the EMREOs depended on the direction and horizontal amplitude of the saccade. They lasted throughout the saccade and well into subsequent periods of steady fixation. We discuss the possibility that the mechanisms underlying EMREOs create eye movement-related binaural cues that may aid the brain in evaluating the relationship between visual and auditory stimulus locations as the eyes move.

reference frame | otoacoustic emissions | middle ear muscles | saccade | EMREO

Visual information can aid hearing, such as when lip reading cues facilitate speech comprehension. To derive such benefits, the brain must first link visual and auditory signals that arise from common locations in space. In species with mobile eyes (e.g., humans, monkeys), visual and auditory spatial cues bear no fixed relationship to one another but change dramatically and frequently as the eyes move, about three times per second over an 80° range of space. Accordingly, considerable effort has been devoted to determining where and how the brain incorporates information about eye movements into the visual and auditory processing streams (1). In the primate brain, all of the regions previously evaluated have shown some evidence that eye movements modulate auditory processing [inferior colliculus (2-6), auditory cortex (7-9), parietal cortex (10-12), and superior colliculus (13-17)]. Such findings raise the question of where in the auditory pathway eye movements first impact auditory processing. In this study, we tested whether eye movements affect processing in the auditory periphery.

The auditory periphery possesses at least two means of tailoring its processing in response to descending neural control (Fig. 1). First, the middle ear muscles (MEMs), the stapedius and tensor tympani, attach to the ossicles that connect the eardrum to the oval window of the cochlea. Contraction of these muscles tugs on the ossicular chain, modulating middle ear sound transmission and moving the eardrum. Second, within the cochlea, the outer hair cells (OHCs) are mechanically active and modify the motion of both the basilar membrane and, through mechanical coupling via the ossicles, the eardrum [i.e., otoacoustic emissions (OAEs)]. In short, the actions of the MEMs and OHCs affect not only the response to incoming sound but also transmi...
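The abstract above describes detecting EMREOs by recording from an ear canal microphone during a saccade task and examining the eardrum signal time-locked to saccade onset, separately by saccade direction. The sketch below illustrates one way such saccade-aligned averaging could be carried out; it is not the authors' analysis pipeline, and the sampling rate, epoch window, variable names, and synthetic data are all illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch: epoch an ear-canal microphone recording around saccade
# onsets and average within each saccade direction. All parameters below
# (sampling rate, epoch window) are assumptions for illustration only.

FS = 48_000                  # assumed microphone sampling rate (Hz)
PRE_MS, POST_MS = 25, 100    # assumed epoch window around saccade onset (ms)

def saccade_aligned_average(mic_signal, saccade_onsets_s, saccade_directions):
    """Average microphone epochs time-locked to saccade onset.

    mic_signal         : 1-D array of microphone samples (pressure, arbitrary units)
    saccade_onsets_s   : saccade onset times in seconds (e.g., from an eye tracker)
    saccade_directions : +1 for rightward, -1 for leftward saccades
    Returns a dict mapping direction -> mean epoch.
    """
    pre = int(FS * PRE_MS / 1000)
    post = int(FS * POST_MS / 1000)
    epochs = {+1: [], -1: []}
    for t, d in zip(saccade_onsets_s, saccade_directions):
        i = int(round(t * FS))
        if i - pre < 0 or i + post > len(mic_signal):
            continue  # skip epochs that run off the edge of the recording
        seg = mic_signal[i - pre : i + post].astype(float)
        seg -= seg[:pre].mean()  # baseline-correct using the pre-saccade interval
        epochs[d].append(seg)
    return {d: np.mean(v, axis=0) for d, v in epochs.items() if v}

# Usage with synthetic placeholder data; in a real experiment mic_signal would be
# the ear-canal recording and onsets/directions would come from the eye tracker.
rng = np.random.default_rng(0)
mic_signal = rng.normal(scale=1e-3, size=FS * 60)          # 60 s of noise
saccade_onsets_s = np.sort(rng.uniform(1, 59, size=150))   # fake saccade times
saccade_directions = rng.choice([+1, -1], size=150)
means = saccade_aligned_average(mic_signal, saccade_onsets_s, saccade_directions)
```

Comparing the mean epochs for leftward versus rightward saccades in this kind of analysis is what would reveal the direction-dependent amplitude and phase reported above.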