A critical step in self-motion perception and spatial awareness is the integration of motion cues from multiple sensory organs that individually do not provide an accurate representation of the physical world. One of the best-studied sensory ambiguities is found in visual processing, and arises because of the inherent uncertainty in detecting the motion direction of an untextured contour moving within a small aperture. A similar sensory ambiguity arises in identifying the actual motion associated with linear accelerations sensed by the otolith organs in the inner ear. These internal linear accelerometers respond identically to the inertial accelerations of translational motion (for example, running forward) and to the gravitational acceleration experienced as the head reorients relative to gravity (that is, head tilt). Using new stimulus combinations, we identify here cerebellar and brainstem motion-sensitive neurons that compute a solution to the inertial motion detection problem. We show that the firing rates of these populations of neurons reflect the computations necessary to construct an internal model representation of the physical equations of motion.
The ability to orient and navigate through the terrestrial environment represents a computational challenge common to all vertebrates. It arises because the motion sensors in the inner ear (the otolith organs and the semicircular canals) transduce self-motion in an egocentric reference frame. As a result, vestibular afferent information reaching the brain is inappropriate for coding our own motion and orientation relative to the outside world. Here we show that cerebellar cortical neuron activity in vermal lobules 9 and 10 reflects the critical computations of transforming head-centered vestibular afferent information into earth-referenced self-motion and spatial orientation signals. Unlike vestibular and deep cerebellar nuclei neurons, where a mixture of responses was observed, Purkinje cells represent a homogeneous population that encodes inertial motion. They carry the earth-horizontal component of a spatially transformed and temporally integrated rotation signal from the semicircular canals, which is critical for computing head attitude, thus isolating inertial linear accelerations during navigation.
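The "earth-horizontal component" of the canal rotation signal described above has a simple vector form: it is the part of head angular velocity orthogonal to the current gravity direction, i.e. the part that changes head attitude (tilt). A minimal sketch of that projection follows; the function name is illustrative, not taken from the paper.

```python
import numpy as np

def earth_horizontal_rotation(omega, g):
    """Project head-frame angular velocity onto the earth-horizontal plane.

    omega : (3,) angular velocity from the semicircular canals (rad/s)
    g     : (3,) current gravity vector in head coordinates

    Only this component reorients the head relative to gravity; the
    earth-vertical component (rotation about the gravity axis) leaves
    tilt, and hence the otolith signal, unchanged.
    """
    g_hat = np.asarray(g, dtype=float) / np.linalg.norm(g)
    omega = np.asarray(omega, dtype=float)
    # Subtract the component of omega along gravity, keeping the tilt part
    return omega - np.dot(omega, g_hat) * g_hat
```

For an upright head (gravity along the head's z-axis), this removes the yaw component of rotation and passes the pitch and roll components, which are the ones that alter head orientation relative to gravity.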
The ability to navigate in the world and execute appropriate behavioral responses depends critically on the contribution of the vestibular system to the detection of motion and spatial orientation. A complicating factor is that otolith afferents equivalently encode inertial and gravitational accelerations. Recent studies have demonstrated that the brain can resolve this sensory ambiguity by combining signals from both the otoliths and semicircular canal sensors, although it remains unknown how the brain integrates these sensory contributions to perform the nonlinear vector computations required to accurately detect head movement in space. Here, we illustrate how a physiologically relevant, nonlinear integrative neural network could be used to perform the required computations for inertial motion detection along the interaural head axis. The proposed model not only can simulate recent behavioral observations, including a translational vestibuloocular reflex driven by the semicircular canals, but also accounts for several previously unexplained characteristics of central neural responses such as complex otolith-canal convergence patterns and the prevalence of dynamically processed otolith signals. A key model prediction, implied by the required computations for tilt-translation discrimination, is a coordinate transformation of canal signals from a head-fixed to a spatial reference frame. As a result, cell responses may reflect canal signal contributions that cannot be easily detected or distinguished from otolith signals. New experimental protocols are proposed to characterize these cells and identify their contributions to spatial motion estimation. The proposed theoretical framework makes an essential first link between the computations for inertial acceleration detection derived from the physical laws of motion and the neural response properties predicted in a physiologically realistic network implementation.
The ability to navigate in the world and execute appropriate behavioral and motor responses depends critically on our capacity to construct an accurate internal representation of our current motion and orientation in space. Vestibular sensory signals are among those that may make an essential contribution to the construction of such 'internal models'. Movement in a gravitational environment represents a situation where the construction of internal models becomes particularly important because the otolith organs, like any linear accelerometer, sense inertial and gravitational accelerations equivalently. Otolith afferents thus provide inherently ambiguous motion information, as they respond identically to translation and head reorientation relative to gravity. Resolution of this ambiguity requires the nonlinear integration of linear acceleration and angular velocity cues, as predicted by the physical equations of motion. Here, we summarize evidence that during translations and tilts from upright the firing rates of brainstem and cerebellar neurons encode a combination of dynamically processed semicircular canal and otolith signals appropriate to construct an internal model representation of the computations required for inertial motion detection.
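The nonlinear integration referred to across these abstracts can be sketched in a few lines. As a rough illustration (a discrete-time sketch, not any of the authors' neural models), assume the convention that the otoliths report the gravito-inertial sum f = a + g in head coordinates; canal angular velocity ω then updates an internal gravity estimate via the kinematic relation dg/dt = -ω × g, and subtracting that estimate from f recovers the translational acceleration a. The function name and the Euler integration scheme are illustrative assumptions.

```python
import numpy as np

G = 9.81  # gravitational acceleration magnitude, m/s^2

def estimate_translation(f, omega, g0, dt):
    """Recover inertial (translational) acceleration from ambiguous otolith input.

    f     : (T, 3) gravito-inertial acceleration sensed by the otoliths, f = a + g
    omega : (T, 3) angular velocity from the semicircular canals (head frame, rad/s)
    g0    : (3,) initial gravity vector in head coordinates
    dt    : time step, s
    """
    g = np.asarray(g0, dtype=float)
    a_est = np.zeros_like(np.asarray(f, dtype=float))
    for t in range(len(f)):
        # Internal model: in head coordinates gravity obeys dg/dt = -omega x g
        g = g + dt * -np.cross(omega[t], g)
        g *= G / np.linalg.norm(g)  # correct Euler drift: |g| is constant
        # Tilt-translation discrimination: subtract the estimated gravity
        a_est[t] = f[t] - g
    return a_est
```

During a pure tilt, f equals the rotating gravity vector and the translation estimate stays near zero; during a pure translation, ω is zero, the gravity estimate is constant, and the estimate tracks f - g. Without the canal term the two stimuli are indistinguishable, which is the ambiguity these papers address.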