Spatial information is conveyed to the primary visual cortex in retinal coordinates. Movement trajectory programming, however, requires a transformation from this sensory frame of reference into a frame appropriate for the selected part of the body, such as the eye, head or arm. To achieve this transformation, visual information must be combined with information from other sources: for instance, the location of an object of interest can be defined with respect to the observer's head if the position of the eyes in the orbit is known and is added to the object's retinal coordinates. Here we show that in a subdivision of the monkey parietal lobe, the ventral intraparietal area (VIP), the activity of visual neurons is modulated by eye-position signals, as in many other areas of the cortical visual system. We find that individual receptive fields of a population of VIP neurons are organized along a continuum, from eye to head coordinates. In the latter case, neurons encode the azimuth and/or elevation of a visual stimulus, independently of the direction in which the eyes are looking, thus representing spatial locations explicitly in at least a head-centred frame of reference.
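The additive combination described above (head-centred location = retinal location + eye-in-orbit position) can be made concrete with a short sketch. This is an illustrative calculation under a small-angle approximation, not the authors' model or analysis; the function names and example angles are invented for the illustration.

import numpy as np

def head_centered(retinal_deg, eye_deg):
    # Head-centred azimuth/elevation as the sum of retinal coordinates and
    # eye-in-orbit position (additive combination; small-angle approximation).
    return np.asarray(retinal_deg, dtype=float) + np.asarray(eye_deg, dtype=float)

def retinal_peak_of_head_centered_rf(rf_head_deg, eye_deg):
    # A neuron tuned to a fixed head-centred location responds maximally at a
    # retinal location that shifts opposite to the eye displacement.
    return np.asarray(rf_head_deg, dtype=float) - np.asarray(eye_deg, dtype=float)

# A stimulus 10 deg right of the fovea, with the eyes deviated 15 deg to the
# right, lies 25 deg right of the head's mid-sagittal plane.
print(head_centered([10.0, 0.0], [15.0, 0.0]))                     # [25.  0.]
# A head-centred receptive field at (25, 0) deg peaks at retinal (10, 0) deg
# for that eye position, but at (25, 0) deg when the eyes look straight ahead.
print(retinal_peak_of_head_centered_rf([25.0, 0.0], [15.0, 0.0]))  # [10.  0.]
print(retinal_peak_of_head_centered_rf([25.0, 0.0], [0.0, 0.0]))   # [25.  0.]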
Self-motion detection requires the interaction of a number of sensory systems for correct perceptual interpretation of a given movement and an eventual motor response. Parietal cortical areas are thought to play an important role in this function, and we have therefore studied the encoding of multimodal signals and their spatiotemporal interactions in the ventral intraparietal area of macaque monkeys. Using extracellular single-cell recordings in awake, head-fixed animals, we identified for the first time vestibular sensory input to this area and described its interaction with somatosensory and visual signals. Visual responses were driven by large-field stimuli that simulated either backward or forward self-motion (contraction or expansion stimuli, respectively) or movement in the frontoparallel plane (visual increments moving simultaneously in the same direction). Although the dominant sensory modality in most neurons was visual, about one third of all recorded neurons responded to horizontal rotation. These vestibular responses were typically in phase with head velocity, but in some cases they signalled acceleration or even showed integration to position. The associated visual responses were always codirectional with the vestibular on-direction, i.e. noncomplementary. Somatosensory responses were in register with the visual preferred direction, pointing either in the same or in the opposite direction, thus signalling translation or rotation in the horizontal plane. These results, taken together with data on responses to optic-flow stimuli obtained in a parallel study, strongly suggest an involvement of area VIP in the analysis and encoding of self-motion.
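The distinction between velocity-, acceleration- and position-related vestibular responses can be read off the response phase relative to head velocity during sinusoidal rotation: an acceleration signal leads velocity by roughly 90 degrees, while a signal reflecting integration to position lags it by roughly 90 degrees. The sketch below is a generic signal-processing illustration with hypothetical stimulus parameters, not the recording or analysis protocol of the study.

import numpy as np

f = 0.5                                  # rotation frequency in Hz (hypothetical)
t = np.linspace(0.0, 4.0, 4001)          # two full cycles, 1 ms steps
velocity = np.cos(2.0 * np.pi * f * t)           # head-velocity profile
acceleration = np.gradient(velocity, t)          # leads velocity by ~90 deg
position = np.cumsum(velocity) * (t[1] - t[0])   # lags velocity by ~90 deg

def phase_re_velocity(signal):
    # Phase (degrees) of the stimulus-frequency component of a signal,
    # expressed relative to the head-velocity profile.
    ref = np.exp(-1j * 2.0 * np.pi * f * t)
    return np.degrees(np.angle(np.sum(signal * ref) / np.sum(velocity * ref)))

print(phase_re_velocity(acceleration))   # ~ +90 deg: acceleration-like response
print(phase_re_velocity(velocity))       # ~   0 deg: velocity-like response
print(phase_re_velocity(position))       # ~ -90 deg: integration to position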