The sensory signals that drive movement planning arrive in a variety of “reference frames”, so integrating or comparing them requires sensory transformations. We propose a model in which the statistical properties of sensory signals and their transformations determine how these signals are used. This model captures the patterns of gaze-dependent error found in our human psychophysics experiment as the sensory signals available for reach planning were varied. These results challenge two widely held ideas: that error patterns directly reflect the reference frame of the underlying neural representation, and that it is preferable to use a single common reference frame for movement planning. We show that gaze-dependent error patterns, often cited as evidence for retinotopic reach planning, can be explained by a transformation bias and are not exclusively linked to retinotopic representations. Further, the presence of multiple reference frames allows for optimal use of available sensory information and explains task-dependent reweighting of sensory signals.
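The core idea can be illustrated with a minimal sketch (a toy model with invented parameters, not the fitted model from the paper): a visual target estimate arrives eye-centered while proprioception arrives body-centered, so planning in body coordinates forces a transformation of the visual signal; if that transformation adds noise and a gaze-dependent bias, reliability-weighted integration yields gaze-dependent reach errors without any retinotopic plan.

```python
import numpy as np

# Toy sketch: gaze-dependent reach errors from a biased, noisy
# reference-frame transformation plus reliability-weighted integration.
# All parameter values below are illustrative.

rng = np.random.default_rng(0)

def transform_to_body(eye_estimate, gaze, noise_sd=1.0, bias_gain=0.1):
    """Eye- to body-centered conversion with added noise and a bias
    proportional to gaze eccentricity (the hypothesized error source)."""
    return eye_estimate + gaze + bias_gain * gaze + rng.normal(0.0, noise_sd)

def integrate(estimates, variances):
    """Combine estimates weighted by reliability (inverse variance)."""
    w = 1.0 / np.asarray(variances)
    return float(np.sum(w * np.asarray(estimates)) / np.sum(w))

target = 10.0   # true target location, body-centered (deg)
gaze = 15.0     # fixation eccentricity (deg)

vis_eye = target - gaze + rng.normal(0.0, 1.0)   # retinal estimate
prop_body = target + rng.normal(0.0, 2.0)        # proprioceptive estimate

vis_body = transform_to_body(vis_eye, gaze)
# Visual variance = retinal variance + transformation variance.
plan = integrate([vis_body, prop_body], [1.0 + 1.0, 4.0])
print(f"planned reach {plan:.2f} deg, gaze-dependent error {plan - target:+.2f} deg")
```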
The planning and control of sensory-guided movements require the integration of multiple sensory streams. Although the information conveyed by different sensory modalities often overlaps, this shared information is represented differently across modalities during the early stages of cortical processing. We ask how these diverse sensory signals are represented in multimodal sensorimotor areas of cortex in macaque monkeys. While a common, modality-independent representation might facilitate downstream readout, previous studies have found that modality-specific representations in multimodal cortex reflect earlier spatial representations; for example, visual signals have a more eye-centered representation. We recorded neural activity from two parietal areas involved in reach planning, Area 5 and the medial intraparietal area (MIP), as animals reached to visual, combined visual and proprioceptive, or proprioceptive targets while fixing their gaze on another location. In contrast to other multimodal cortical areas, both Area 5 and MIP used the same spatial representations for visual and proprioceptive signals. However, these representations were heterogeneous: although we observed a posterior-to-anterior gradient in parietal population responses, from more eye-centered to more hand- or body-centered representations, we did not observe the simple, discrete reference-frame representations suggested by studies that focused on identifying the “best match” reference frame for a given cortical area. In summary, we find modality-independent representations of spatial information in parietal cortex, though these representations are complex and heterogeneous.
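A simplified, simulated version of the "best match" comparison is sketched below (illustrative values only, not the recording or analysis pipeline): an eye-centered unit's tuning curve shifts with gaze, so correlating curves from two gaze positions after shifting by the gaze displacement should beat correlating them in place, which is the body-centered prediction.

```python
import numpy as np

# Simulated reference-frame test for one unit; all values are invented.

targets = np.arange(-20, 21, 5)   # target positions in body coordinates (deg)
step = 5.0                        # bin width (deg)
gaze_a, gaze_b = -10.0, 10.0      # two fixation positions (deg)

def tuning(centers, pref, width=8.0):
    """Gaussian tuning curve over the given centers."""
    return np.exp(-0.5 * ((centers - pref) / width) ** 2)

# Simulate an eye-centered unit: tuning depends on target - gaze.
resp_a = tuning(targets - gaze_a, pref=5.0)
resp_b = tuning(targets - gaze_b, pref=5.0)

def alignment(r1, r2, shift_bins):
    """Correlation after shifting r2 by shift_bins bins (np.roll wraps at
    the edges, which is acceptable for this toy example)."""
    return np.corrcoef(r1, np.roll(r2, shift_bins))[0, 1]

eye_shift = int(round((gaze_a - gaze_b) / step))   # eye-centered prediction
eye_score = alignment(resp_a, resp_b, eye_shift)
body_score = alignment(resp_a, resp_b, 0)          # body-centered prediction
print(f"eye-centered fit {eye_score:.2f} vs body-centered fit {body_score:.2f}")
```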
Visuomotor coordination requires both the accurate alignment of spatial information from different sensory streams and the ability to convert these sensory signals into accurate motor commands. Both processes are highly plastic, as illustrated by the rapid adaptation of goal-directed movements following exposure to shifted visual feedback. Although visual-shift adaptation is a widely used model of sensorimotor learning, its multifaceted adaptive response is typically poorly quantified. We present an approach for quantitatively characterizing both the sensory and the task-dependent components of adaptation. Sensory aftereffects are quantified with "alignment tests" that provide a localized, two-dimensional measure of sensory recalibration. These sensory effects obey a precise form of "additivity": the shift in sensory alignment between vision and the right hand equals the vector sum of the shifts between vision and the left hand and between the right and left hands. This additivity holds both at the exposure location and at a second generalization location. These results support a component-transformation model of sensory coordination, in which eye-hand and hand-hand alignment rely on a sequence of shared sensory transformations. We also ask how these sensory effects compare with the aftereffects measured in target-reaching and tracking tasks. We find that the aftereffect depends both on the task performed during feedback-shift exposure and on the testing task, suggesting the presence of both a general sensory recalibration and a task-dependent sensorimotor effect. The task-dependent effect is observed in highly stereotyped reaching movements, but not in the more variable tracking task.
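The additivity prediction amounts to a simple vector identity, sketched below with invented example shifts (not the measured data): the vision/right-hand shift should equal the vector sum of the vision/left-hand and left-hand/right-hand shifts.

```python
import numpy as np

# Additivity check on illustrative 2-D alignment shifts (x, y in cm).
# The component-transformation model predicts
#   shift(vision, right hand) = shift(vision, left hand) + shift(left, right).

shift_vis_lh = np.array([1.2, -0.4])   # vision vs. left hand
shift_lh_rh = np.array([0.5, 0.9])     # left hand vs. right hand
shift_vis_rh = np.array([1.7, 0.5])    # vision vs. right hand (measured)

predicted = shift_vis_lh + shift_lh_rh
residual = np.linalg.norm(shift_vis_rh - predicted)
print(f"predicted {predicted}, measured {shift_vis_rh}, residual {residual:.2f} cm")
```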
Rodent whisker input consists of dense microvibration sequences that are often temporally integrated for perceptual discrimination. Whether primary somatosensory cortex (S1) participates in this temporal integration is unknown. We trained rats to discriminate whisker impulse sequences that varied in single-impulse kinematics (5–20-ms time scale) and mean speed (150-ms time scale). Rats appeared to use the integrated feature, mean speed, to guide discrimination in this task, consistent with similar prior studies. Despite this, 52% of S1 units, including 73% of units in L4 and L2/3, encoded sequences at fast time scales (≤20 ms, mostly 5–10 ms), accurately reflecting single-impulse kinematics. Another 17% of units, mostly in L5, showed weaker impulse responses and a slow firing-rate increase during sequences. However, these units did not effectively integrate whisker impulses; instead, they combined weak impulse responses with a distinct, slow signal correlated with behavioral choice. A neural decoder could identify sequences from the spike trains of fast units and behavioral choice from slow units. Thus, S1 encoded fast-time-scale whisker input without substantial temporal integration across whisker impulses.
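A minimal version of the two decoders can be sketched on simulated data (invented statistics, not the recorded dataset): sequence identity is read out from fast units' 5-ms bin patterns, while choice is read out from slow units' trial-wide rates.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Simulated decoders: fast units carry sequence identity in per-bin spike
# patterns; slow units carry a trial-wide rate that covaries with choice.

rng = np.random.default_rng(1)
n_trials, n_bins = 200, 30                 # 30 bins x 5 ms = 150-ms sequences
seq_id = rng.integers(0, 2, n_trials)      # which impulse sequence was played
choice = rng.integers(0, 2, n_trials)      # the animal's behavioral choice

# Fast units: per-bin spike counts whose pattern depends on the sequence.
templates = rng.poisson(2.0, (2, n_bins)).astype(float)
fast = rng.poisson(templates[seq_id] + 0.5)

# Slow units: a single rate per trial, shifted by choice, blind to sequence.
slow = rng.normal(5.0 + 2.0 * choice, 1.0).reshape(-1, 1)

clf = LogisticRegression(max_iter=1000)
print("sequence from fast units:", cross_val_score(clf, fast, seq_id, cv=5).mean())
print("choice from slow units:  ", cross_val_score(clf, slow, choice, cv=5).mean())
```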