The perception of self-motion direction, or heading, relies on integration of multiple sensory cues, especially from the visual and vestibular systems. However, the reliability of sensory information can vary rapidly and unpredictably, and it remains unclear how the brain integrates multiple sensory signals given this dynamic uncertainty. Human psychophysical studies have shown that observers combine cues by weighting them in proportion to their reliability, consistent with statistically optimal integration schemes derived from Bayesian probability theory. Remarkably, because cue reliability is varied randomly across trials, the perceptual weight assigned to each cue must change from trial to trial. Dynamic cue reweighting has not been examined for combinations of visual and vestibular cues, nor has the Bayesian cue integration approach been applied to laboratory animals, an important step toward understanding the neural basis of cue integration. To address these issues, we tested human and monkey subjects in a heading discrimination task involving visual (optic flow) and vestibular (translational motion) cues. The cues were placed in conflict on a subset of trials, and their relative reliability was varied to assess the weights that subjects gave to each cue in their heading judgments. We found that monkeys can rapidly reweight visual and vestibular cues according to their reliability, the first such demonstration in a nonhuman species. However, some monkeys and humans tended to over-weight vestibular cues, inconsistent with simple predictions of a Bayesian model. Nonetheless, our findings establish a robust model system for studying the neural mechanisms of dynamic cue reweighting in multisensory perception.
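The statistically optimal integration scheme this abstract refers to is the standard maximum-likelihood rule in which each cue is weighted by its reliability (inverse variance). A minimal sketch, with illustrative heading values and noise levels (all numbers here are hypothetical, not data from the study):

```python
import math

def combine_cues(mu_vis, sigma_vis, mu_ves, sigma_ves):
    """Reliability-weighted (maximum-likelihood) combination of a visual
    and a vestibular heading estimate. Reliability is inverse variance,
    so the weights sum to 1 and shift toward the less noisy cue."""
    r_vis = 1.0 / sigma_vis**2
    r_ves = 1.0 / sigma_ves**2
    w_vis = r_vis / (r_vis + r_ves)
    w_ves = 1.0 - w_vis
    mu_comb = w_vis * mu_vis + w_ves * mu_ves
    # The combined variance is lower than either single-cue variance.
    sigma_comb = math.sqrt(1.0 / (r_vis + r_ves))
    return mu_comb, sigma_comb

# Illustrative trial: a degraded visual cue (sigma = 2 deg) signals 4 deg
# rightward, a reliable vestibular cue (sigma = 1 deg) signals straight ahead.
mu, sigma = combine_cues(mu_vis=4.0, sigma_vis=2.0, mu_ves=0.0, sigma_ves=1.0)
```

Because reliability enters the weights directly, varying cue reliability from trial to trial predicts exactly the trial-by-trial reweighting the abstract describes.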
Multisensory calibration is fundamental for proficient interaction within a changing environment. Initial studies suggested a visual-dominant mechanism. More recently, a cue-reliability based model, similar to optimal cue-integration, has been proposed. However, a more general, reliability-independent model of fixed-ratio adaptation (of which visual-dominance is a sub-case) has never been tested. Here, we studied behavior of both humans and monkeys performing a heading-discrimination task. Subjects were presented with either visual (optic-flow), vestibular (motion-platform) or combined (visual/vestibular) stimuli, and required to report whether self-motion was to the right/left of straight ahead. A systematic heading-discrepancy was introduced between the visual and vestibular cues, without external feedback. Cue-calibration was measured by the resulting sensory adaptation. Both visual and vestibular cues significantly adapted in the direction required to reduce cue-conflict. However, unlike multisensory cue-integration, cue-calibration was not reliability-based. Rather, a model of fixed-ratio adaptation best described the data, whereby vestibular adaptation was greater than visual adaptation, irrespective of relative cue-reliability. The average ratio of vestibular to visual adaptation was 1.75 and 2.30 for the human and monkey data, respectively. Furthermore, only through modeling fixed-ratio adaptation (using the ratio extracted from the data) were we able to account for reliability-based cue-integration during the adaptation process. The finding that cue-calibration does not depend on cue-reliability is consistent with the notion that it follows an underlying estimate of cue-accuracy. Cue-accuracy is generally independent of cue-reliability and its estimate may change with a much slower time-constant. Thus, greater vestibular vs. visual (fixed-ratio) adaptation suggests lower vestibular vs. visual cue-accuracy.
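The fixed-ratio model can be sketched as splitting a given cue-conflict between the two modalities at a constant vestibular:visual ratio, regardless of reliability. The 1.75 ratio below is the average human value reported in the abstract; the overall adaptation gain and the example conflict magnitude are illustrative assumptions:

```python
def fixed_ratio_adaptation(conflict_deg, total_gain=0.3, ratio=1.75):
    """Divide a visual-vestibular heading conflict between the cues at a
    fixed vestibular:visual adaptation ratio. Note that cue reliability
    does not appear anywhere in this rule, unlike cue-integration."""
    shift_vis = total_gain * conflict_deg / (1.0 + ratio)
    shift_ves = ratio * shift_vis
    # The two shifts act in opposite directions on the conflict,
    # so together they reduce the discrepancy by total_gain.
    return shift_vis, shift_ves

# Illustrative session with a 10-degree systematic heading discrepancy.
shift_vis, shift_ves = fixed_ratio_adaptation(10.0)
```

Visual dominance is the limiting sub-case of this rule in which the ratio grows without bound, so only the vestibular estimate adapts.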
MacNeilage PR, Turner AH, Angelaki DE. Canal-otolith interactions and detection thresholds of linear and angular components during curved-path self-motion. J Neurophysiol 104: 765-773, 2010. First published June 16, 2010 doi:10.1152/jn.01067.2009. Gravitational signals arising from the otolith organs and vertical plane rotational signals arising from the semicircular canals interact extensively for accurate estimation of tilt and inertial acceleration. Here we used a classical signal detection paradigm to examine perceptual interactions between otolith and horizontal semicircular canal signals during simultaneous rotation and translation on a curved path. In a rotation detection experiment, blindfolded subjects were asked to detect the presence of angular motion in blocks where half of the trials were pure nasooccipital translation and half were simultaneous translation and yaw rotation (curved-path motion). In separate, translation detection experiments, subjects were also asked to detect either the presence or the absence of nasooccipital linear motion in blocks, in which half of the trials were pure yaw rotation and half were curved path. Rotation thresholds increased slightly, but not significantly, with concurrent linear velocity magnitude. Yaw rotation detection threshold, averaged across all conditions, was 1.45 ± 0.81°/s (3.49 ± 1.95°/s²). Translation thresholds, on the other hand, increased significantly with increasing magnitude of concurrent angular velocity. Absolute nasooccipital translation detection threshold, averaged across all conditions, was 2.93 ± 2.10 cm/s (7.07 ± 5.05 cm/s²). These findings suggest that conscious perception might not have independent access to separate estimates of linear and angular movement parameters during curved-path motion. Estimates of linear (and perhaps angular) components might instead rely on integrated information from canals and otoliths.
Such interaction may underlie previously reported perceptual errors during curved-path motion and may originate from mechanisms that are specialized for tilt-translation processing during vertical plane rotation.
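In the classical signal detection framework this abstract uses, detection sensitivity in a yes/no task is summarized by d', computed from the hit and false-alarm rates across the block. A minimal sketch (the example rates are hypothetical, not from the study):

```python
from statistics import NormalDist

def d_prime(hit_rate, fa_rate):
    """Sensitivity index d' for a yes/no detection task:
    z(hit rate) - z(false-alarm rate), where z is the
    inverse of the standard normal CDF."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

# Illustrative block: detecting yaw rotation on half the trials
# (curved-path) with pure-translation trials as the noise distribution.
sensitivity = d_prime(hit_rate=0.84, fa_rate=0.16)
```

A detection threshold is then the stimulus magnitude (e.g., angular velocity in °/s) at which d' reaches a criterion value such as 1, which is how per-condition thresholds like those above can be extracted.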