We describe the use of a bidirectional neuromyoelectric prosthetic hand that conveys biomimetic sensory feedback. Electromyographic recordings from residual arm muscles were decoded to provide independent and proportional control of a six-DOF prosthetic hand and wrist—the DEKA LUKE arm. Activation of contact sensors on the prosthesis resulted in intraneural microstimulation of residual sensory nerve fibers through chronically implanted Utah Slanted Electrode Arrays, thereby evoking tactile percepts on the phantom hand. With sensory feedback enabled, the participant exhibited greater precision in grip force and was better able to handle fragile objects. With active exploration, the participant was also able to distinguish between small and large objects and between soft and hard ones. When the sensory feedback was biomimetic—designed to mimic natural sensory signals—the participant identified objects significantly faster than with traditional encoding algorithms that depended only on the present stimulus intensity. Thus, artificial touch can be sculpted by patterning the sensory feedback, and biologically inspired patterns elicit more interpretable and useful percepts.
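The contrast drawn above—traditional encoding driven only by instantaneous stimulus intensity versus biomimetic encoding shaped like natural afferent responses—can be illustrated with a minimal sketch. This is not the published algorithm; the derivative-emphasis scheme, the 0–300 Hz stimulation range, and the `tau` gain are illustrative assumptions that loosely mimic how rapidly adapting afferents emphasize contact transients.

```python
import numpy as np

def linear_encoding(force, f_max=300.0):
    """Traditional scheme (sketch): stimulation rate tracks only the
    instantaneous normalized contact force. Range is hypothetical."""
    return f_max * np.clip(force, 0.0, 1.0)

def biomimetic_encoding(force, dt=0.01, f_max=300.0, tau=0.05):
    """Biomimetic-style scheme (sketch): mix the force signal with the
    magnitude of its rate of change so contact onsets/offsets are
    emphasized, loosely mimicking rapidly adapting afferents.
    The gain tau is illustrative, not a published parameter."""
    dforce = np.gradient(force, dt)
    drive = np.clip(force + tau * np.abs(dforce), 0.0, 1.0)
    return f_max * drive

# Demo: grip force ramps up over 200 ms, then holds steady.
t = np.arange(0.0, 1.0, 0.01)
force = np.clip(t / 0.2, 0.0, 1.0)
rate_lin = linear_encoding(force)
rate_bio = biomimetic_encoding(force)
# During the ramp the biomimetic rate exceeds the linear rate;
# at steady-state hold, the two schemes converge.
```

The point of the sketch is only that the two encodings diverge during force transients (object contact and release) while agreeing at steady state, which is where temporally patterned feedback could carry extra information.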
To effortlessly complete an intentional movement, the brain needs feedback from the body regarding the movement’s progress. This largely non-conscious kinesthetic sense helps the brain learn relationships between motor commands and outcomes so that movement errors can be corrected. Prosthetic systems for restoring function have predominantly focused on controlling motorized joint movement. Without the kinesthetic sense, however, these devices cannot be controlled intuitively. Here we report a method for endowing human amputees with a kinesthetic perception of dexterous robotic hands. Vibrating the muscles used for prosthetic control via a neural-machine interface produced the illusory perception of complex grip movements. Within minutes, three amputees integrated this kinesthetic feedback and improved movement control. Combining intent, kinesthesia, and vision instilled in participants a sense of agency over the robotic movements. This feedback approach for closed-loop control opens a pathway to seamless integration of minds and machines.
Multiperspective analysis reveals that neurorobotic sensory and motor fusion in a bionic system promotes intrinsic neural behaviors.
Schofield et al., “Long-Term Sensory-Motor-Integrated Prosthetic Arm Use,” Frontiers in Neuroscience (www.frontiersin.org), February 2020, Volume 14, Article 120. The study documented a spectrum of performance changes following long-term use. Furthermore, after the take-home period, participants more appropriately integrated their prostheses into their body images, and psychophysical tests provided strong evidence that neural and cortical adaptation occurred.
Fitts’ law models the relationship between amplitude, precision, and speed of rapid movements. It is widely used to quantify performance in pointing tasks, study human-computer interaction, and generally to understand perceptual-motor information processes, including research to model performance in isometric force production tasks. Applying Fitts’ law to an isometric grip force task would allow for quantifying grasp performance in rehabilitative medicine and may aid research on prosthetic control and design. We examined whether Fitts’ law would hold when participants attempted to accurately produce their intended force output while grasping a manipulandum when presented with images of various everyday objects (we termed this the implicit task). Although our main interest was the implicit task, to benchmark it and establish validity, we examined performance against a more standard visual feedback condition via a digital force-feedback meter on a video monitor (explicit task). Next, we progressed from visual force feedback with force meter targets to the same targets without visual force feedback (operating largely on feedforward control with tactile feedback). This provided an opportunity to see if Fitts’ law would hold without vision, and allowed us to progress toward the more naturalistic implicit task (which does not include visual feedback). Finally, we changed the nature of the targets from requiring explicit force values presented as arrows on a force-feedback meter (explicit targets) to the more naturalistic and intuitive target forces implied by images of objects (implicit targets). With visual force feedback the relation between task difficulty and the time to produce the target grip force was predicted by Fitts’ law (average r2 = 0.82). Without vision, average grip force scaled accurately although force variability was insensitive to the target presented. 
In contrast, images of everyday objects generated more reliable grip forces without the visualized force meter. In sum, population means were well-described by Fitts’ law for explicit targets with vision (r2 = 0.96) and implicit targets (r2 = 0.89), but not as well-described for explicit targets without vision (r2 = 0.54). Implicit targets should provide a realistic see-object-squeeze-object test using Fitts’ law to quantify the relative speed-accuracy relationship of any given grasper.
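The Fitts’ law analysis described above reduces to a linear regression of time-to-target against an index of difficulty. A minimal sketch of that computation follows, using Fitts’ classic formulation ID = log2(2A/W), where A is the target force amplitude and W the tolerance window; the target values and times below are synthetic placeholders for illustration only—the r² values quoted in the abstract come from the participants’ actual data.

```python
import numpy as np

# Hypothetical grip-force targets: amplitude A (N) and tolerance
# window width W (N). Index of difficulty per Fitts (1954):
# ID = log2(2A / W), in bits.
A = np.array([5.0, 10.0, 20.0, 40.0])
W = np.array([2.0, 2.0, 2.0, 2.0])
ID = np.log2(2 * A / W)

# Synthetic mean times-to-target (s), for illustration only.
MT = np.array([0.45, 0.61, 0.74, 0.92])

# Fit the Fitts model MT = a + b * ID by least squares.
b, a = np.polyfit(ID, MT, 1)
pred = a + b * ID
ss_res = np.sum((MT - pred) ** 2)
ss_tot = np.sum((MT - MT.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"a = {a:.3f} s, b = {b:.3f} s/bit, r^2 = {r2:.3f}")
```

A high r² on such a fit is what it means for performance to be “well-described by Fitts’ law”; the slope b (seconds per bit) indexes how steeply added precision demands slow the response.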