There is considerable evidence that the cerebellum plays a vital role in motor learning by constructing an estimate of the sensory consequences of movement. Theory suggests that this estimate is compared with the actual sensory feedback to drive motor learning. However, direct evidence for the existence of this comparison has been lacking. Here we carried out a trial-by-trial analysis of cerebellar neurons during the execution and adaptation of voluntary head movements and found that neuronal sensitivities dynamically track the comparison of predictive and feedback signals. When the relationship between the motor command and the resultant movement was altered, neurons robustly responded to sensory input as if the movement were externally generated. Neuronal sensitivities then declined with the same time course as the concurrent behavioral learning. These findings demonstrate the output of an elegant computation in which rapid updating of an internal model enables the motor system to learn to expect unexpected sensory inputs.
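Read computationally, this finding is consistent with a simple forward-model update rule. The sketch below is a minimal illustration under assumed parameters (the linear gains, learning rate, and all variable names are ours, not the authors' fitted model): the sensory prediction error, taken here as a proxy for the neuronal response, decays across trials with the same exponential-like time course described for the behavioral learning.

```python
# Minimal sketch of trial-by-trial forward-model adaptation (assumed
# linear model and learning rate; not the authors' fitted model). A
# forward model predicts the sensory consequence of each motor command;
# the neuronal response is taken to reflect the prediction error, and
# the model gain is updated in proportion to that error.

learning_rate = 0.2   # assumed adaptation rate
true_gain = 1.5       # altered command-to-movement relationship
model_gain = 1.0      # internal model calibrated to the old relationship

for trial in range(10):
    command = 1.0                        # stereotyped voluntary command
    feedback = true_gain * command       # actual sensory feedback
    prediction = model_gain * command    # forward-model prediction
    error = feedback - prediction        # sensory prediction error
    model_gain += learning_rate * error  # internal model update
    print(f"trial {trial:2d}: prediction error = {error:+.3f}")
```

Under this toy rule the error shrinks geometrically (0.500, 0.400, 0.320, ...), mirroring the reported decline in neuronal sensitivity as the internal model is updated.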
SUMMARY

Background: The ability to distinguish sensory signals that register unexpected events (exafference) from those generated by voluntary actions (reafference) during self-motion is essential for accurate perception and behavior. The cerebellum is most commonly considered in relation to its contributions to the fine-tuning of motor commands and the sensorimotor calibration required for motor learning. During unexpected motion, however, the sensory prediction errors that drive motor learning potentially provide a neural basis for the computation underlying the distinction between reafference and exafference.

Results: Recording from monkeys during voluntary and applied self-motion, we demonstrate that individual cerebellar output neurons encode an explicit and selective representation of unexpected self-motion by means of an elegant computation that cancels the reafferent sensory effects of self-generated movements. During voluntary self-motion, the sensory responses of neurons that robustly encode unexpected movement are cancelled: neurons with vestibular and proprioceptive responses to applied head and body movements are unresponsive when the same motion is self-generated. When sensory reafference and exafference are experienced simultaneously, individual neurons provide a precise estimate of the detailed time course of exafference.

Conclusions: These results provide an explicit solution to the longstanding problem of understanding the mechanisms by which the brain anticipates the sensory consequences of our voluntary actions. Specifically, by revealing a striking computation of a sensory prediction error signal that effectively distinguishes between the sensory consequences of self-generated and externally produced actions, our findings overturn the conventional view that the sensory errors encoded by the cerebellum principally contribute to the fine-tuning of motor activity required for motor learning.
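The cancellation computation described in the Results can be summarized in a few lines. The following sketch assumes a purely linear subtraction and illustrative signal shapes (not the recorded data): subtracting an efference-copy-based prediction of reafference from the total vestibular input leaves exactly the exafferent component.

```python
import numpy as np

# Sketch of reafference cancellation (assumed linear subtraction;
# signal shapes are illustrative). Vestibular afferents encode total
# head motion; an efference-copy-based prediction of the self-generated
# component is subtracted, leaving the exafferent component.

t = np.linspace(0.0, 1.0, 200)
active = np.sin(2 * np.pi * t)          # voluntary (reafferent) motion
passive = 0.5 * np.sin(6 * np.pi * t)   # unexpected (exafferent) motion

total_vestibular = active + passive     # afferents encode the sum
reafference_prediction = active         # derived from motor efference copy
neuron_output = total_vestibular - reafference_prediction

# The output isolates the exafferent time course, as reported for
# simultaneous self-generated and applied motion.
assert np.allclose(neuron_output, passive)
```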
The ability to accurately control posture and to perceive self-motion and spatial orientation requires knowledge of the motion of both the head and the body. However, while the vestibular sensors and nuclei directly encode head motion, no sensors directly encode body motion. Instead, the convergence of vestibular and neck proprioceptive inputs during self-motion is generally believed to underlie the ability to compute body motion. Here, we provide evidence that the brain explicitly computes an internal estimate of body motion at the level of single cerebellar neurons. Neuronal responses were recorded from the rostral fastigial nucleus, the most medial of the deep cerebellar nuclei, during whole-body, body-under-head, and head-on-body rotations. We found that approximately half of the neurons encoded the motion of the body-in-space, while the other half encoded the motion of the head-in-space in a manner similar to neurons in the vestibular nuclei. Notably, neurons encoding body motion responded to both vestibular and proprioceptive stimulation (accordingly termed bimodal neurons). In contrast, neurons encoding head motion were sensitive only to vestibular inputs (accordingly termed unimodal neurons). Comparison of the proprioceptive and vestibular responses of bimodal neurons further revealed similar tuning in response to changes in head-on-body position. We propose that this similarity in the nonlinear processing of vestibular and proprioceptive signals underlies the accurate computation of body motion. Furthermore, the same neurons that encode body motion (i.e., bimodal neurons) most likely encode vestibular signals in a body-referenced coordinate frame, since the integration of proprioceptive and vestibular information is required for both computations.
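The implied arithmetic is compact: body-in-space motion is head-in-space motion minus head-on-body motion. The sketch below assumes linear signals and sign conventions of our own choosing (the shared nonlinear position tuning reported above is not modeled):

```python
import numpy as np

# Sketch of the body-motion computation (assumed linear combination and
# signs). Vestibular input encodes head-in-space velocity; neck
# proprioception encodes head-on-body velocity; their difference yields
# body-in-space velocity.

t = np.linspace(0.0, 1.0, 200)
head_in_space = np.sin(2 * np.pi * t)        # vestibular signal
head_on_body = 0.4 * np.sin(2 * np.pi * t)   # neck proprioceptive signal

# "Bimodal" neuron: combines both modalities to encode body motion.
body_in_space = head_in_space - head_on_body

# "Unimodal" neuron: vestibular input only, encodes head motion.
head_motion_signal = head_in_space

print("peak body-in-space velocity:", round(float(body_in_space.max()), 3))
```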
The ability to keep track of where we are going as we navigate through our environment requires knowledge of our ongoing location and orientation. In response to passively applied motion, the otolith organs of the vestibular system encode changes in the velocity and direction of linear self-motion (i.e., heading). When self-motion is voluntarily generated, proprioceptive and motor efference copy information is also available to contribute to the brain's internal representation of current heading direction and speed. However, to date, how the brain integrates these extra-vestibular cues with otolith signals during active linear self-motion has remained unknown. Here, to address this question, we compared the responses of macaque vestibular neurons during active and passive translations. Single-unit recordings were made from a subgroup of neurons at the first central stage of sensory processing in the vestibular pathways, which are involved in postural control and the computation of self-motion perception. Neurons responded far less robustly to otolith stimulation during self-generated than during passive head translations. Yet the mechanism underlying this marked cancellation of otolith signals did not affect other characteristics of the neuronal responses (i.e., baseline firing rate, tuning ratio, orientation of the maximal sensitivity vector). Transiently applied perturbations during active motion further established that the otolith cancellation signal was gated in only when proprioceptive sensory feedback matched the motor-based expectation. Together, our results have important implications for understanding the brain's ability to ensure accurate postural and motor control, as well as perceptual stability, during active self-motion.
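As a hedged illustration of the gating result (the threshold rule and all names below are hypothetical; the study reports the gating behavior, not this specific mechanism), cancellation could be modeled as applied only when the proprioceptive mismatch is small:

```python
# Hypothetical sketch of proprioceptive gating of otolith cancellation.
# The cancellation signal is subtracted only when actual proprioceptive
# feedback matches the motor-based expectation (assumed threshold rule).

def neuron_response(otolith_input, predicted_reafference,
                    proprioceptive_mismatch, tolerance=0.1):
    """Modeled modulation of a translation-sensitive central neuron."""
    if abs(proprioceptive_mismatch) < tolerance:
        # Feedback matches expectation: reafference is cancelled.
        return otolith_input - predicted_reafference
    # Unexpected perturbation: cancellation is not gated in.
    return otolith_input

# Active translation (feedback matches expectation): attenuated response.
print(neuron_response(1.0, 1.0, proprioceptive_mismatch=0.0))  # 0.0
# Transient perturbation during active motion: robust response.
print(neuron_response(1.0, 1.0, proprioceptive_mismatch=0.5))  # 1.0
```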
In everyday life, vestibular sensors are activated by both self-generated and externally applied head movements. The ability to distinguish inputs that are a consequence of our own actions (i.e., active motion) from those that result from changes in the external world (i.e., passive or unexpected motion) is essential for perceptual stability and accurate motor control. Recent work has made progress toward understanding how the brain distinguishes between these two kinds of sensory inputs. We have performed a series of experiments in which single-unit recordings were made from vestibular afferents and central neurons in alert macaque monkeys during rotation and translation. Vestibular afferents showed no differences in firing variability or sensitivity during active movements when compared to passive movements. In contrast, the analyses of neuronal firing rates revealed that neurons at the first central stage of vestibular processing (i.e., in the vestibular nuclei) were effectively less sensitive to active motion. Notably, however, this ability to distinguish between active and passive motion was not a general feature of early central processing, but rather was a characteristic of a distinct group of neurons known to contribute to postural control and spatial orientation. Our most recent studies have addressed how vestibular and proprioceptive inputs are integrated in the vestibular cerebellum, a region likely to be involved in generating an internal model of self-motion. We propose that this multimodal integration within the vestibular cerebellum is required for eliminating self-generated vestibular information from the subsequent computation of orientation and posture control at the first central stage of processing.