The otoliths are stimulated in the same fashion by gravitational and inertial forces, so the otolith signal is an ambiguous indicator of self-orientation and linear acceleration. The ambiguity can be resolved by adding visual information that indicates orientation and acceleration with respect to the earth. Here we present a Bayesian model of the statistically optimal combination of noisy vestibular and visual signals. Likelihoods associated with sensory measurements are represented in a two-dimensional orientation/acceleration space. The likelihood function associated with the otolith signal illustrates the ambiguity: it admits no unique solution for self-orientation or acceleration. Likelihood functions associated with other sensory signals can resolve this ambiguity. In addition, we propose two priors, each acting on one dimension of the orientation/acceleration space: an idiotropic prior and a no-acceleration prior. We conducted experiments using a motion platform and an attached visual display to examine the influence of visual signals on the interpretation of the otolith signal. Subjects made pitch and acceleration judgments as the vestibular and visual signals were manipulated independently. The model's predictions were confirmed: (1) visual signals affected the interpretation of the otolith signal, (2) less variable signals had more influence on perceived orientation and acceleration than more variable ones, and (3) combined estimates were more precise than single-cue estimates. We also show that the model can explain some well-known phenomena, including the perception of upright in zero gravity, the Aubert effect, and the somatogravic illusion.
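To make the combination rule concrete, here is a minimal sketch of Bayesian cue combination over a discretized orientation/acceleration space, written in Python. All values (grid ranges, the sensed shear force, the noise and prior widths) are illustrative assumptions rather than quantities from the study; the otolith likelihood is a ridge because the otoliths sense only the combined gravito-inertial force.

```python
import numpy as np

# Grid over the two-dimensional orientation/acceleration space.
# Ranges, resolutions, and all sensory values below are illustrative.
pitch = np.linspace(-30.0, 30.0, 241)        # head pitch (deg)
accel = np.linspace(-4.0, 4.0, 241)          # fore-aft acceleration (m/s^2)
P, A = np.meshgrid(pitch, accel)
g = 9.81

def gauss(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

# Otolith likelihood: the otoliths sense only the combined gravito-inertial
# shear force, so every (pitch, accel) pair producing the same force is
# equally likely. The likelihood is a ridge with no unique solution.
gif_measured = 2.0                            # sensed shear force (m/s^2)
L_oto = gauss(g * np.sin(np.radians(P)) + A, gif_measured, 0.5)

# Visual likelihood: vision constrains orientation and acceleration
# separately, which breaks the otolith ambiguity.
L_vis = gauss(P, 0.0, 5.0) * gauss(A, 2.0, 1.0)

# Priors: an idiotropic prior pulling pitch toward upright and a
# no-acceleration prior pulling acceleration toward zero.
prior = gauss(P, 0.0, 10.0) * gauss(A, 0.0, 2.0)

posterior = L_oto * L_vis * prior
posterior /= posterior.sum()
i, j = np.unravel_index(np.argmax(posterior), posterior.shape)
print(f"MAP estimate: pitch = {P[i, j]:.1f} deg, accel = {A[i, j]:.2f} m/s^2")
```

Because the posterior is a product of likelihoods and priors, the less variable signal automatically dominates the estimate, which is the weighting behavior the experiments confirmed.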
Effective navigation and locomotion depend critically on an observer's ability to judge direction of linear self-motion, i.e., heading. The vestibular cue to heading is the direction of inertial acceleration that accompanies transient linear movements. This cue is transduced by the otolith organs. The otoliths also respond to gravitational acceleration, so vestibular heading discrimination could depend on (1) the direction of movement in head coordinates (i.e., relative to the otoliths), (2) the direction of movement in world coordinates (i.e., relative to gravity), or (3) body orientation (i.e., the direction of gravity relative to the otoliths). To quantify these effects, we measured vestibular and visual discrimination of heading along azimuth and elevation dimensions with observers oriented both upright and side-down relative to gravity. We compared vestibular heading thresholds with corresponding measurements of sensitivity to linear motion along lateral and vertical axes of the head (coarse direction discrimination and amplitude discrimination). Neither heading nor coarse direction thresholds depended on movement direction in world coordinates, demonstrating that the nervous system compensates for gravity. Instead, they depended similarly on movement direction in head coordinates (better performance in the horizontal plane) and on body orientation (better performance in the upright orientation). Heading thresholds were correlated with, but significantly larger than, predictions based on sensitivity in the coarse discrimination task. Simulations of a neuron/anti-neuron pair with idealized cosine-tuning properties show that heading thresholds larger than those predicted from coarse direction discrimination could be accounted for by an amplitude-response nonlinearity in the neural representation of inertial motion.
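The neuron/anti-neuron argument can be illustrated with a short simulation. The Python sketch below is not the study's model: it assumes Poisson variability, a baseline-plus-gain rate function, and an expansive power law as the amplitude-response nonlinearity, and it shows how such a nonlinearity inflates fine heading thresholds by weakening responses to the small lateral motion components that distinguish nearby headings.

```python
import numpy as np

# Idealized neuron/anti-neuron pair with cosine direction tuning, lateral
# preferred directions (+/-90 deg), and Poisson variability (variance = mean).
# Baseline, gain, and the expansive exponent are illustrative assumptions.
def rate(theta, pref, amp, f):
    drive = amp * np.cos(np.radians(theta - pref))    # signed projected amplitude
    return 20.0 + 15.0 * f(drive)

def dprime(theta, amp, f):
    # Discriminate heading +theta vs -theta (deg from straight ahead) from the
    # difference signal between the neuron (pref +90) and anti-neuron (pref -90).
    mu_pos = rate(+theta, 90, amp, f) - rate(+theta, -90, amp, f)
    mu_neg = rate(-theta, 90, amp, f) - rate(-theta, -90, amp, f)
    var_avg = sum(rate(t, p, amp, f) for t in (+theta, -theta)
                  for p in (90, -90)) / 2             # mean Poisson variance
    return abs(mu_pos - mu_neg) / np.sqrt(var_avg)

def threshold(amp, f):
    thetas = np.linspace(0.1, 45.0, 900)
    d = np.array([dprime(t, amp, f) for t in thetas])
    return thetas[np.argmax(d >= 1.0)]                # smallest theta with d' >= 1

linear = lambda drive: drive
expansive = lambda drive: np.sign(drive) * drive ** 2 # assumed nonlinearity

print(f"linear amplitude response:    {threshold(1.0, linear):5.2f} deg")
print(f"expansive amplitude response: {threshold(1.0, expansive):5.2f} deg")
# The expansive nonlinearity yields a markedly larger heading threshold than
# the linear prediction, the pattern described above.
```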
Heading estimation is vital to everyday navigation and locomotion. Despite extensive behavioral and physiological research on both visual and vestibular heading estimation over more than two decades, the accuracy of heading estimation has not been systematically evaluated. We therefore assessed human visual and vestibular heading estimation in the horizontal plane using a motion platform and a stereo visual display. Heading angle was overestimated during forward movements and underestimated during backward movements in response to both visual and vestibular stimuli, indicating an overall multimodal bias toward lateral directions. These lateral biases are consistent with the overrepresentation of lateral preferred directions observed in neural populations that carry visual and vestibular heading information, including MSTd and otolith afferent populations. Because of this overrepresentation, population vector decoding yields patterns of bias remarkably similar to those observed behaviorally. Lateral biases are inconsistent with standard Bayesian accounts, which predict that estimates should be biased toward the most common heading direction, straight ahead. Nevertheless, lateral biases may be functionally relevant: they effectively constitute a perceptual scale expansion around straight ahead, which could allow more precise estimation and provide a high-gain feedback signal to facilitate maintenance of straight-ahead heading during everyday navigation and locomotion.
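A compact way to see how the overrepresentation produces lateral biases is to simulate population vector decoding directly. The Python sketch below assumes idealized cosine tuning and a von Mises clustering of preferred directions around +/-90 deg; the clustering strength and population size are arbitrary choices, not values fitted to MSTd or afferent data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Preferred directions overrepresenting lateral headings (+/-90 deg),
# modeled as a mixture of two von Mises bumps (assumed kappa = 2).
n = 4000
prefs = np.concatenate([
    rng.vonmises(+np.pi / 2, 2.0, n // 2),
    rng.vonmises(-np.pi / 2, 2.0, n // 2),
])

def population_vector(heading_deg):
    """Decode heading from baseline-subtracted cosine-tuned responses."""
    theta = np.radians(heading_deg)
    w = np.cos(theta - prefs)                 # tuning-curve modulation
    x, y = np.sum(w * np.cos(prefs)), np.sum(w * np.sin(prefs))
    return np.degrees(np.arctan2(y, x))

for h in (10, 20, 30):                        # forward headings (deg)
    print(f"true {h:2d} deg -> decoded {population_vector(h):5.1f} deg")
# Forward headings are decoded as more lateral than they truly are,
# reproducing the overestimation pattern described above.
```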
Spatial orientation is the sense of body orientation and self-motion relative to the stationary environment; it is fundamental to normal waking behavior and to the control of everyday motor actions, including eye movements, postural control, and locomotion. The brain achieves spatial orientation by integrating visual, vestibular, and somatosensory signals. In recent years, considerable progress has been made toward understanding how these signals are processed, using multiple computational approaches that include frequency-domain analysis, the concept of internal models, observer theory, Bayesian theory, and Kalman filtering. Here we put these approaches in context by examining the specific questions each technique can address and some of the scientific insights that have resulted. We conclude with a recent application of particle filtering, a probabilistic simulation technique that aims to generate the most likely state estimates by incorporating internal models of sensor dynamics and physical laws, noise associated with sensory processing, and prior knowledge or experience. In this framework, priors favoring low angular velocity and low linear acceleration can explain the phenomena of velocity storage and frequency segregation, both of which have previously been modeled using arbitrary low-pass filtering. How Kalman and particle filters might be implemented by the brain is an emerging field of study. Unlike past neurophysiological research, which has aimed to characterize the mean responses of single neurons, investigations of dynamic Bayesian inference should attempt to characterize population activities that constitute probabilistic representations of sensory and prior information.
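As an illustration of the particle-filtering idea, here is a minimal bootstrap particle filter for a one-dimensional angular-velocity state, written in Python. The shrink-toward-zero dynamics stand in for the low-angular-velocity prior; the decay, noise levels, and particle count are assumptions for the sketch, not parameters of any fitted model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Bootstrap particle filter: particles represent hypotheses about angular
# velocity; the prediction step encodes a low-velocity prior (decay toward 0).
n_particles, decay = 5000, 0.95
q, r = 0.5, 2.0                    # process / observation noise SD (deg/s)

true_v = 10.0                      # constant true rotation (deg/s)
particles = rng.normal(0.0, 5.0, n_particles)
estimates = []

for t in range(50):
    obs = true_v + rng.normal(0.0, r)               # noisy canal-like signal
    # Predict: prior dynamics shrink each particle toward zero velocity.
    particles = decay * particles + rng.normal(0.0, q, n_particles)
    # Update: weight by the observation likelihood, then resample.
    w = np.exp(-0.5 * ((obs - particles) / r) ** 2)
    w /= w.sum()
    particles = rng.choice(particles, n_particles, p=w)
    estimates.append(particles.mean())

print(f"final estimate: {estimates[-1]:.1f} deg/s (true {true_v:.1f} deg/s)")
# The low-velocity prior pulls the estimate below the true velocity, the
# kind of systematic underestimation such priors produce.
```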
Judging object trajectory during self-motion is a fundamental ability for mobile organisms interacting with their environment. It requires the nervous system to compensate for the visual consequences of self-motion in order to judge accurately, but the mechanisms of this compensation are poorly understood. We comprehensively examined both the accuracy and the precision of observers' judgments of object trajectory in the world when self-motion was defined by vestibular, visual, or combined visual-vestibular cues. Without decision feedback, subjects showed no compensation for self-motion defined solely by vestibular cues, partial compensation (47%) for visually defined self-motion, and significantly greater compensation (58%) during combined visual-vestibular self-motion. With decision feedback, subjects learned to judge object trajectory in the world accurately, and this learning generalized to novel self-motion speeds. Across conditions, greater compensation for self-motion was associated with lower precision of object-trajectory judgments, indicating that self-motion compensation comes at the cost of reduced discriminability. Our findings suggest that the brain can flexibly represent object trajectory relative to either the observer or the world, but that a world-centered representation is less precise because it incorporates noisy self-motion signals.
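The accuracy/precision trade-off can be captured in a few lines. The Python sketch below assumes a simple additive model in which perceived world motion equals observer-relative (retinal) motion plus a gain-scaled, noisy self-motion estimate; the gains and noise levels are illustrative, with the 0.5 gain loosely in the range of the partial compensation reported above.

```python
import numpy as np

rng = np.random.default_rng(2)

# Additive model: perceived object motion in the world = observer-relative
# motion + gain * (noisy self-motion estimate). gain = 0 is no compensation,
# gain = 1 is full compensation. All values are illustrative assumptions.
v_obj_world = 0.0                         # object truly stationary (m/s)
v_self = 0.30                             # observer self-motion (m/s)

def perceived_world_motion(gain, n=100_000, retinal_sd=0.02, self_sd=0.10):
    v_retinal = (v_obj_world - v_self) + rng.normal(0.0, retinal_sd, n)
    v_self_est = v_self + rng.normal(0.0, self_sd, n)
    return v_retinal + gain * v_self_est

for gain in (0.0, 0.5, 1.0):
    p = perceived_world_motion(gain)
    print(f"gain {gain:.1f}: bias {p.mean():+.3f} m/s, sd {p.std():.3f} m/s")
# Increasing compensation removes the bias (the object is judged stationary
# in the world) but injects self-motion noise, reducing precision, which is
# the trade-off described above.
```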