During self-motion, humans typically move the eyes to maintain fixation on the stationary environment around them. These eye movements could in principle be used to estimate self-motion, but their impact on perception is unknown. We had participants judge self-motion during different eye-movement conditions in the absence of full-field optic flow. In a two-alternative forced choice task, participants indicated whether the second of two successive passive lateral whole-body translations was longer or shorter than the first. This task was used in two experiments. In the first (n = 8), eye movements were constrained differently in the two translation intervals by presenting either a world-fixed or body-fixed fixation point or no fixation point at all (allowing free gaze). Results show that perceived translations were shorter with a body-fixed than a world-fixed fixation point. A linear model indicated that eye-movement signals received a weight of ∼25% for the self-motion percept. This model was independently validated in the trials without a fixation point (free gaze). In the second experiment (n = 10), gaze was free during both translation intervals. An oculomotor choice probability analysis showed that the translation with the larger eye-movement excursion was judged to be larger more often than expected by chance. We conclude that eye-movement signals influence self-motion perception, even in the absence of visual stimulation.
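The linear model described above can be illustrated as a weighted combination of vestibular and eye-movement displacement signals. This is a minimal sketch of that idea, not the authors' code; the function name, variable names (`vestibular_cm`, `eye_cm`), and the example displacement values are our own illustrative assumptions, with only the ~25% weight taken from the abstract.

```python
def perceived_translation(vestibular_cm, eye_cm, w_eye=0.25):
    """Weighted linear combination of self-motion cues: the eye-movement
    signal receives a weight of ~25%, the vestibular signal the rest."""
    return w_eye * eye_cm + (1.0 - w_eye) * vestibular_cm

# With a world-fixed fixation point the eyes counter-rotate, so the
# eye-movement signal matches the translation; with a body-fixed point
# the eyes stay still and the eye signal reads zero, shrinking the percept.
world_fixed = perceived_translation(vestibular_cm=20.0, eye_cm=20.0)  # 20.0
body_fixed = perceived_translation(vestibular_cm=20.0, eye_cm=0.0)    # 15.0
```

The shrunken `body_fixed` estimate captures why perceived translations were shorter with a body-fixed than a world-fixed fixation point.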
Estimation of the orientation of the head relative to the earth's vertical is thought to rely on the integration of vestibular and visual cues. The role of visual cues can be tested using a rod-and-frame task in which a global visual scene, typically a square frame, is displayed at different orientations together with a rod whose perceived direction is a proxy for the head-in-space estimate. While it is known that the frame biases this percept, and hence the subjective visual vertical, the possible role of the rod itself in this processing has not been examined. Current models of spatial orientation assume that the visual orientation of the rod and its uncertainty play no role in the visual-vestibular integration process, but are only involved in the transformation that yields rod orientation in space, thereby contributing additive noise to the subjective visual vertical. Here we tested the validity of this assumption in the rod-and-frame task by replacing the rod with an ellipse whose orientation uncertainty was manipulated by varying its eccentricity (i.e., making the ellipse more or less rounded). Using a psychophysical approach, subjects performed this ellipse-and-frame task for three different eccentricities of the ellipse (0.74, 0.82, 0.99) and three frame orientations (−17.5°, 0°, 17.5°). Results show that ellipse eccentricity affects the uncertainty but not the bias of the subjective visual vertical, suggesting that the ellipse does not interact with the frame in global visual processing but contributes additive noise in computing its orientation in world coordinates.
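The additive-noise assumption tested above makes a simple quantitative prediction: the variability of the SVV should grow with the orientation uncertainty of the ellipse while the frame-induced bias stays fixed. A minimal sketch of that prediction follows; the function and parameter names, and the assumption of independent Gaussian noise sources, are ours, not the paper's.

```python
import math

def svv_prediction(frame_bias_deg, sigma_integration_deg, sigma_ellipse_deg):
    """Additive-noise account: the ellipse contributes noise downstream of
    visual-vestibular integration, so variances add while the frame-induced
    bias is unchanged. Returns (predicted bias, predicted SD) of the SVV."""
    sd = math.sqrt(sigma_integration_deg**2 + sigma_ellipse_deg**2)
    return frame_bias_deg, sd

# A rounder ellipse (lower eccentricity) has a noisier orientation estimate:
# the SVV uncertainty rises, but the bias does not.
bias, sd = svv_prediction(frame_bias_deg=5.0,
                          sigma_integration_deg=3.0,
                          sigma_ellipse_deg=4.0)  # bias 5.0, sd 5.0
```

This is exactly the pattern the abstract reports: eccentricity modulated the uncertainty but not the bias of the SVV.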
The percept of vertical, which mainly relies on vestibular and visual cues, is known to be affected after sustained whole-body roll tilt, mostly at roll positions adjacent to the position of adaptation. Here we ask whether the viewing of panoramic visual cues during the adaptation further influences the percept of the visual vertical. Participants were rotated in the frontal plane to a 90° clockwise tilt position, which was maintained for 4 minutes. During this period, participants either remained in darkness or viewed panoramic pictures that were either veridical (aligned with gravity) or oriented along the body longitudinal axis. Errors of the subsequent subjective visual vertical (SVV), measured at various tilt angles, showed that the adaptation effect of panoramic cues is local, i.e., confined to a narrow range of tilts in the direction of the adaptation angle. This distortion was found irrespective of the orientation of the panoramic cues. We conclude that sustained exposure to panoramic and vestibular cues does not adapt the subsequent percept of vertical to the direction of the panoramic cue. Rather, our results suggest that sustained panoramic cues affect the SVV by an indirect effect on head orientation, with a 90° periodicity, that interacts with a vestibular cue to determine the percept of vertical.
The vestibular system provides information for spatial orientation. However, this information is ambiguous: because the otoliths sense the gravitoinertial force, they cannot distinguish gravitational and inertial components. As a consequence, prolonged linear acceleration of the head can be interpreted as tilt, referred to as the somatogravic effect. Previous modeling work suggests that the brain disambiguates the otolith signal according to the rules of Bayesian inference, combining noisy canal cues with the a priori assumption that prolonged linear accelerations are unlikely. Within this modeling framework, the noise of the vestibular signals affects the dynamic characteristics of the tilt percept during linear whole-body motion. To test this prediction, we devised a novel paradigm to psychometrically characterize the dynamic visual vertical, as a proxy for the tilt percept, during passive sinusoidal linear motion along the interaural axis (0.33 Hz motion frequency, 1.75 m/s² peak acceleration, 80 cm displacement). While subjects (n = 10) kept fixation on a central body-fixed light, a line was briefly flashed (5 ms) at different phases of the motion, the orientation of which had to be judged relative to gravity. Consistent with the model's prediction, subjects showed a phase-dependent modulation of the dynamic visual vertical, with a subject-specific phase shift with respect to the imposed acceleration signal. The magnitude of this modulation was smaller than predicted, suggesting a contribution of nonvestibular signals to the dynamic visual vertical. Despite their dampening effect, our findings may point to a link between the noise components in the vestibular system and the characteristics of dynamic visual vertical. A fundamental question in neuroscience is how the brain processes vestibular signals to infer the orientation of the body and objects in space.
We show that, under sinusoidal linear motion, systematic error patterns appear in the disambiguation of linear acceleration and spatial orientation. We discuss the dynamics of these illusory percepts in terms of a dynamic Bayesian model that combines uncertainty in the vestibular signals with priors based on the natural statistics of head motion.
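The Bayesian disambiguation underlying the somatogravic effect can be illustrated with a one-dimensional, small-angle MAP estimate. This is our own minimal sketch, not the authors' model: the prior widths (`sigma_tilt`, `sigma_acc`) are arbitrary illustrative values, canal dynamics and measurement noise are omitted, and the linearization f ≈ g·θ + a holds only for small tilt angles θ.

```python
G = 9.81  # gravitational acceleration, m/s^2

def map_tilt_estimate(f_interaural, sigma_tilt=0.2, sigma_acc=1.0):
    """Small-angle MAP sketch of otolith disambiguation.

    The otolith measures the interaural gravitoinertial force
    f = g*theta + a, confounding tilt (theta, rad) and linear
    acceleration (a, m/s^2). With zero-mean Gaussian priors on both,
    minimizing theta**2/sigma_tilt**2 + (f - G*theta)**2/sigma_acc**2
    gives a shrunken attribution of f to tilt.
    """
    k = (G**2 * sigma_tilt**2) / (G**2 * sigma_tilt**2 + sigma_acc**2)
    return k * f_interaural / G

# A sustained interaural acceleration with no actual tilt still yields a
# nonzero tilt estimate: the somatogravic illusion.
illusory_tilt_rad = map_tilt_estimate(1.75)
```

Because the prior deems prolonged linear accelerations unlikely, part of the measured force is attributed to tilt, which is why sinusoidal acceleration produces the phase-dependent modulation of the visual vertical reported above.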