Surface electromyography (sEMG) has been the predominant method for sensing muscle electrical activity in muscle-computer interface applications, including myoelectric control of prostheses and rehabilitation robots. Ultrasound imaging, which senses the mechanical deformation of functional muscle compartments, can overcome several limitations of sEMG, including the inability to differentiate between deep contiguous muscle compartments, low signal-to-noise ratio, and lack of a robust graded signal. The objective of this study was to evaluate the feasibility of real-time graded control using a computationally efficient method to differentiate between complex hand motions based on ultrasound imaging of forearm muscles. Dynamic ultrasound images of the forearm muscles were obtained from six able-bodied volunteers and analyzed to map muscle activity based on the deformation of the contracting muscles during different hand motions. Each participant performed 15 different hand motions, including digit flexion, different grips (i.e., power grasp and pinch grip), and grips in combination with wrist pronation. During the training phase, we generated a database of activity patterns corresponding to the different hand motions for each participant. During the testing phase, novel activity patterns were classified using a nearest-neighbor classification algorithm based on that database. The average classification accuracy was 91%. Real-time image-based control of a virtual hand showed an average classification accuracy of 92%. Our results demonstrate the feasibility of using ultrasound imaging as a robust muscle-computer interface. Potential clinical applications include control of multiarticulated prosthetic hands, stroke rehabilitation, and fundamental investigations of motor control and biomechanics.
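As a rough illustration of the classification step described in this abstract, the sketch below applies nearest-neighbor matching to per-motion activity-pattern feature vectors. This is a sketch under assumptions, not the authors' implementation: the mapping from ultrasound frames to feature vectors is assumed, and all function names and toy data are hypothetical.

```python
# Hypothetical sketch of nearest-neighbor classification of ultrasound
# "activity patterns", assuming each hand motion in the training database
# is represented by one or more 1-D feature vectors.
import numpy as np

def build_database(training_patterns):
    """training_patterns: dict mapping motion label -> list of 1-D feature vectors."""
    labels, vectors = [], []
    for motion, patterns in training_patterns.items():
        for p in patterns:
            labels.append(motion)
            vectors.append(np.asarray(p, dtype=float))
    return labels, np.vstack(vectors)

def classify(novel_pattern, labels, vectors):
    """Return the label of the training pattern closest (Euclidean distance) to the novel pattern."""
    novel = np.asarray(novel_pattern, dtype=float)
    distances = np.linalg.norm(vectors - novel, axis=1)
    return labels[int(np.argmin(distances))]

# Toy example with 4-element activity patterns for two motions.
db_labels, db_vectors = build_database({
    "power_grasp": [[0.9, 0.1, 0.8, 0.2]],
    "pinch_grip":  [[0.2, 0.9, 0.1, 0.7]],
})
print(classify([0.85, 0.15, 0.75, 0.25], db_labels, db_vectors))  # -> "power_grasp"
```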
Movement adaptation in response to systematic motor perturbations exhibits distinct spatial and temporal properties. These characteristics are typically studied in isolation, leaving their interaction largely unknown. Here we examined how the temporal decay of visuomotor adaptation influences the spatial generalization of the motor recalibration. First, we quantified the extent to which adaptation decayed over time. Subjects reached to a peripheral target, and a rotation was applied to the visual feedback of the unseen motion. The retention of this adaptation over different delays (0-120 s) decreased by 29.0 ± 6.8% at the longest delay and was well represented by a simple exponential, with a time constant of 22.5 ± 5.6 s. On the basis of this relationship we simulated how the spatial generalization of adaptation would change with delay. To test this directly, we trained additional subjects with the same perturbation, assessed transfer to 19 different locations (spaced 15° apart, symmetric around the trained location), and examined three delays (~4, 12, and 25 s). Consistent with the simulation, we found that generalization around the trained direction (±15°) significantly decreased with delay and distance, while locations >60° away displayed near-constant spatiotemporal transfer. Intermediate distances (30° and 45°) showed a difference in transfer across space, but this amount was approximately constant across time. Interestingly, the decay at the trained direction was faster than that based purely on time, suggesting that the spatial transfer of adaptation is modified by concurrent passive (time-dependent) and active (movement-dependent) processes. Short-term motor adaptation exhibits distinct spatial and temporal characteristics. Here we investigated the interaction of these features using a simple motor adaptation paradigm (recalibration of reaching arm movements in response to rotated visual feedback). We examined changes in the spatial generalization of motor adaptation under different temporal manipulations and report that the spatiotemporal generalization of motor adaptation is generally local and is influenced by both passive (time-dependent) and active (movement-dependent) learning processes.
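For concreteness, here is a minimal sketch of the exponential retention model summarized above, using the reported time constant (~22.5 s) and the ~29% decrease at the 120-s delay. The exact functional form and parameterization are assumptions for illustration, not the authors' fitting procedure.

```python
import math

TAU = 22.5  # retention time constant (s), from the abstract
# Scale chosen so retention drops by ~29% at the longest delay (120 s),
# matching the reported 29.0% decrease; this parameterization is an assumption.
A = 0.29 / (1.0 - math.exp(-120.0 / TAU))

def retention(t_seconds):
    """Fraction of adaptation retained after a delay of t_seconds (hypothetical model)."""
    return 1.0 - A * (1.0 - math.exp(-t_seconds / TAU))

for t in (0, 4, 12, 25, 120):
    print(f"delay {t:>3d} s -> retained fraction {retention(t):.2f}")
```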