Visual and somatosensory signals jointly provide an estimate of the hand's spatial location. While the ability of subjects to identify the spatial location of their hand from visual and proprioceptive signals has previously been characterized, relatively few studies have examined in detail the spatial structure of the proprioceptive map of the arm. Here, we reconstructed and analyzed the spatial structure of the estimation errors that resulted when subjects reported the location of their unseen hand across a 2D horizontal workspace. Hand position estimation was mapped under four conditions: with and without tactile feedback, and with the right and left hands. In the task, we moved each subject's hand to one of 100 targets in the workspace while their eyes were closed. We then either a) applied tactile stimulation to the fingertip by allowing the index finger to touch the target or b) as a control, held the fingertip 2 cm above the target. After returning the hand to a neutral position, subjects opened their eyes and verbally reported where their fingertip had been. We measured and analyzed both the direction and magnitude of the resulting estimation errors. Tactile feedback reduced the magnitude of these errors but did not change their overall structure. In addition, the spatial structure of the errors was idiosyncratic: each subject had a unique pattern of errors that was stable between hands and over time. Finally, we found that at the population level the magnitude of the estimation errors had a characteristic distribution over the workspace: errors were smallest for targets closer to the body. The stability of estimation errors across conditions and time suggests that the brain constructs a proprioceptive map that is reliable, even if it is not necessarily accurate. The idiosyncrasy across subjects emphasizes that each individual constructs a map that is unique to their own experience.
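To make the error analysis concrete, below is a minimal sketch of how the direction and magnitude of 2D estimation errors can be computed from target and reported fingertip positions. The array names, coordinate conventions, and values are illustrative assumptions, not the study's actual data or analysis code.

```python
import numpy as np

# Hypothetical arrays, one row per trial: actual target positions and the
# subject's reported fingertip positions, in workspace coordinates (cm).
targets = np.array([[10.0, 35.0], [-5.0, 42.0], [20.0, 50.0]])  # (n_trials, 2)
reports = np.array([[12.5, 33.0], [-1.0, 44.5], [18.0, 55.0]])  # (n_trials, 2)

errors = reports - targets                            # 2D error vector per trial
magnitudes = np.linalg.norm(errors, axis=1)           # error magnitude (cm)
directions = np.arctan2(errors[:, 1], errors[:, 0])   # error direction (rad)

print(f"mean error magnitude: {magnitudes.mean():.2f} cm")
# A circular mean is appropriate for directions, since angles wrap at +/- pi.
circ_mean = np.arctan2(np.sin(directions).mean(), np.cos(directions).mean())
print(f"circular mean error direction: {np.degrees(circ_mean):.1f} deg")
```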
Somatosensation is divided into multiple discrete modalities that we think of as separable: e.g., tactile, proprioceptive, and temperature sensation. However, in processes such as haptics, those modalities all interact. If one intended to artificially generate a sensation that could be used for stereognosis, for example, it would be crucial to understand these interactions. We are presently examining the relationship between the tactile and proprioceptive modalities in this context. In this overview of some of our recent work, we show that signals that would normally be attributed to two of these systems separately, tactile contact and self-movement, interact both perceptually and physiologically in ways that complicate the understanding of haptic processing. In the first study described here, we show that a tactile illusion on the fingertips, the cutaneous rabbit effect, can be abolished by changing the posture of the fingers. We then discuss activity in primary somatosensory cortical neurons that illustrates the interrelationship of tactile and postural signals. In this study, we used a robot-enhanced virtual environment to show that many neurons in primary somatosensory cortex with cutaneous receptive fields encode elements of both tactile contact and self-motion. We then present the results of studies examining the structure of the process that extracts the spatial location of the hand from proprioceptive signals. The structure of the spatial errors in these maps indicates that the proprioceptive-spatial map is stable but individually constructed. These seemingly disparate studies lead us to suggest that tactile sensation is encoded in a 2D map, but one that undergoes continual dynamic modification by an underlying proprioceptive map. Understanding how the disparate signals that comprise the somatosensory system are processed to produce sensation is an important step toward realizing the kind of seamless integration aspired to in neuroprosthetics.
Current myoelectric prosthetic limbs are limited in their ability to provide direct sensory feedback to users, which increases attentional demands and reliance on visual cues. Vibrotactile sensory substitution (VSS), which can provide sensory feedback non-invasively, has demonstrated some improvement in myoelectric hand control. In this work, we developed and tested two VSS configurations: one with a single burst-rate-modulated actuator and another with a spatially distributed array of five coin tactors. We performed a direct comparative assessment of these two VSS configurations with able-bodied subjects to investigate sensory perception, myoelectric control of grasp force and hand aperture with a prosthesis, and the effects of interface compliance. Six subjects completed a sensory perception experiment under a stimulation-only paradigm; sixteen subjects completed experiments comparing VSS performance on perception and graded myoelectric control during grasp force and hand aperture tasks; and ten subjects completed experiments investigating the effect of the mechanical compliance of the myoelectric hand on the ability to control grasp force. Results indicated that sensory perception of vibrotactile feedback did not differ between the two VSS configurations in the absence of active myoelectric control, but it was better with feedback from the coin tactor array than with the single actuator during myoelectric control of grasp force. Graded myoelectric control of grasp force and hand aperture was better with feedback from the coin tactor array than with the single actuator, and myoelectric control of grasp force improved with a compliant grasp interface. Further investigations of VSS should focus on the use of coin tactor arrays by subjects with amputation in real-world settings and on improving control of grasp force by increasing the mechanical compliance of the hand.
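To illustrate the two feedback encodings being compared, here is a minimal sketch contrasting a burst-rate-modulated single actuator with a five-tactor spatial array. The mapping functions and parameter values are assumptions for illustration, not the configurations used in the study.

```python
def single_tactor_burst_rate(force, min_rate_hz=1.0, max_rate_hz=20.0):
    """Map normalized grasp force (0-1) to a vibration burst rate.

    Hypothetical linear mapping for the single-actuator configuration;
    the rate limits are illustrative, not the study's values.
    """
    force = max(0.0, min(1.0, force))
    return min_rate_hz + force * (max_rate_hz - min_rate_hz)

def tactor_array_site(force, n_tactors=5):
    """Map normalized grasp force (0-1) to one of n spatially distributed tactors.

    Encodes intensity as location: higher force activates a tactor farther
    along the array (again, an assumed encoding scheme).
    """
    force = max(0.0, min(1.0, force))
    return min(n_tactors - 1, int(force * n_tactors))  # tactor index 0..4

for f in (0.1, 0.5, 0.9):
    print(f"force={f:.1f}: burst rate {single_tactor_burst_rate(f):.1f} Hz, "
          f"array tactor #{tactor_array_site(f)}")
```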
The natural world continuously presents us with many opportunities for action, and thus a process of target selection must precede action execution. While there has been considerable progress in understanding target selection in stationary environments, little is known about target selection while we are in motion. Here we investigated the effect of self-motion signals on saccadic target selection in a dynamic environment. Human subjects were sinusoidally translated (f = 0.6 Hz, 30-cm peak-to-peak displacement) along the interaural axis on a vestibular sled. During the motion, two visual targets were presented asynchronously but equidistantly on either side of fixation. Subjects had to look at one of these targets as quickly as possible. Using an adaptive approach, the time delay between the targets was adjusted until the subject selected both targets equally often. We determined this balanced time delay at different phases of the motion in order to distinguish the effects of body acceleration and velocity on saccadic target selection. Results show that acceleration (or position, as the two are indistinguishable during sinusoidal motion), but not velocity, affects target selection for saccades. Subjects preferred to look at targets in the direction of the acceleration: the leftward target was preferred when the sled accelerated to the left, and vice versa. Saccadic reaction times mimicked this selection bias, being reliably shorter to targets in the direction of acceleration. Our results provide evidence that saccade target selection mechanisms are modulated by self-motion signals, which could be derived directly from the otolith system.
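The dissociation between acceleration and velocity rests on simple sinusoidal kinematics: acceleration is exactly proportional to position (with opposite sign), while velocity is 90 degrees out of phase with both. The short sketch below works this through with the study's motion parameters; the code itself is illustrative and not from the study.

```python
import numpy as np

# Sled motion parameters reported in the abstract: 0.6 Hz, 30 cm peak-to-peak.
f = 0.6                       # frequency (Hz)
A = 0.30 / 2                  # amplitude (m), half the peak-to-peak displacement
w = 2 * np.pi * f             # angular frequency (rad/s)

t = np.linspace(0, 1 / f, 5)              # a few sample phases of one cycle
x = A * np.sin(w * t)                     # position
v = A * w * np.cos(w * t)                 # velocity, in quadrature with x
a = -A * w**2 * np.sin(w * t)             # acceleration = -w^2 * x

# Acceleration tracks position exactly (opposite sign), so the two cannot be
# separated during sinusoidal motion; velocity, being 90 deg out of phase,
# can be dissociated by measuring preferences at different motion phases.
print(np.allclose(a, -w**2 * x))          # True
print(f"peak velocity {A*w:.2f} m/s, peak acceleration {A*w**2:.2f} m/s^2")
```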
Neuroprosthetic limbs, however sophisticated their motor control, require sensory feedback to interact viably with the environment. Toward that aim, the authors examined the interrelationships between tactile and proprioceptive sensations. In human psychophysics experiments, they evaluated the error patterns of subjects estimating hand location in a horizontal 2-dimensional workspace under 3 tactile conditions. While tactile cues did not significantly affect the structure of the error pattern, touching the workspace reduced estimation errors. In neurophysiological experiments, a macaque grasped textured objects using 2 hand postures. Sensory coding depended on both the roughness of the manipulandum and the hand posture. In summary, the authors suggest that tactile sensations underlying haptics are processed in a stable spatial reference frame provided by the proprioceptive system, and that tactile and proprioceptive inputs can be encoded simultaneously by individual cells. Such insights will be useful for providing stable, adaptive sensory feedback in neuroprosthetics.