Lower-limb powered prostheses can provide users with volitional control of ambulation. To accomplish this goal, they require a sensing modality that reliably interprets the user's intention to move. Surface electromyography (EMG) has previously been proposed to measure muscle excitation and provide volitional control to upper- and lower-limb powered prosthesis users. Unfortunately, EMG suffers from a low signal-to-noise ratio and crosstalk between neighboring muscles, which often limits the performance of EMG-based controllers. Ultrasound has been shown to have better resolution and specificity than surface EMG. However, this technology has yet to be integrated into lower-limb prostheses. Here we show that A-mode ultrasound sensing can reliably predict the prosthesis walking kinematics of individuals with a transfemoral amputation. A-mode ultrasound features were recorded from the residual limbs of nine transfemoral amputee subjects while they walked with their passive prostheses. The ultrasound features were mapped to joint kinematics through a regression neural network. Testing of the trained model against untrained kinematics from an altered walking speed shows accurate predictions of knee position, knee velocity, ankle position, and ankle velocity, with normalized RMSEs of 9.0 ± 3.1%, 7.3 ± 1.6%, 8.3 ± 2.3%, and 10.0 ± 2.5%, respectively. This ultrasound-based prediction suggests that A-mode ultrasound is a viable sensing technology for recognizing user intent. This study is the first necessary step towards implementation of a volitional prosthesis controller based on A-mode ultrasound for individuals with transfemoral amputation.
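As an illustration of the mapping described above, the following is a minimal sketch of a regression neural network that maps per-frame A-mode ultrasound feature vectors to prosthesis joint kinematics and scores predictions with a range-normalized RMSE. The feature dimension, network sizes, and training settings are assumptions for illustration, not the values used in the study.

import numpy as np
import torch
import torch.nn as nn

N_FEATURES = 64   # assumed A-mode feature dimension per frame (illustrative)
N_OUTPUTS = 4     # knee position, knee velocity, ankle position, ankle velocity

class KinematicsRegressor(nn.Module):
    """Fully connected regression network: ultrasound features -> joint kinematics."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(N_FEATURES, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, N_OUTPUTS),
        )

    def forward(self, x):
        return self.net(x)

def normalized_rmse(pred, true):
    """RMSE normalized by the range of the measured signal, in percent."""
    rmse = np.sqrt(np.mean((pred - true) ** 2, axis=0))
    return 100.0 * rmse / (true.max(axis=0) - true.min(axis=0))

def train(model, X, Y, epochs=200, lr=1e-3):
    """Fit the regressor to feature/kinematics pairs (X: [frames, N_FEATURES], Y: [frames, N_OUTPUTS])."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    X_t = torch.as_tensor(X, dtype=torch.float32)
    Y_t = torch.as_tensor(Y, dtype=torch.float32)
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(X_t), Y_t)
        loss.backward()
        opt.step()
    return model

In this sketch, the network is trained on walking data at one speed and then evaluated on kinematics from an altered speed with normalized_rmse, mirroring the train/test split described in the abstract.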
Many people struggle with mobility impairments due to lower-limb amputations. To participate in society, they need to be able to walk on a wide variety of terrains, such as stairs, ramps, and level ground. Current lower-limb powered prostheses require different control strategies for different ambulation modes and use data from mechanical sensors within the prosthesis to determine which ambulation mode the user is in. However, it can be challenging to distinguish between ambulation modes. Efforts have been made to improve classification accuracy by adding electromyography information, but this requires a large number of sensors, has a low signal-to-noise ratio, and cannot distinguish between superficial and deep muscle activations. An alternative sensing modality, A-mode ultrasound, can detect and distinguish between changes in superficial and deep muscles. It has also shown promising results in upper-limb gesture classification. Despite these advantages, A-mode ultrasound has yet to be employed for lower-limb activity classification. Here we show that A-mode ultrasound can classify ambulation mode with comparable, and in some cases superior, accuracy to mechanical sensing. In this study, seven transfemoral amputee subjects walked through an ambulation circuit while wearing A-mode ultrasound transducers, IMU sensors, and their passive prostheses. The circuit consisted of sitting, standing, level-ground walking, ramp ascent, ramp descent, stair ascent, and stair descent, and a spatial-temporal convolutional network was trained to continuously classify these seven activities. Offline continuous classification with A-mode ultrasound alone achieved an accuracy of 91.8 ± 3.4%, compared with 93.8 ± 3.0% when using kinematic data alone. Combining kinematic and ultrasound data produced 95.8 ± 2.3% accuracy. This suggests that A-mode ultrasound provides additional useful information about the user's gait beyond what is provided by mechanical sensors and that it may be able to improve ambulation mode classification. By incorporating these sensors into powered prostheses, users may benefit from more reliable prostheses and more seamless transitions between ambulation modes.
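For readers unfamiliar with the classifier mentioned above, below is a minimal sketch of a spatial-temporal convolutional network for the seven-class ambulation problem, assuming windowed, channel-stacked ultrasound and IMU inputs. The channel count, window length, and layer sizes are illustrative assumptions rather than the architecture used in the study.

import torch
import torch.nn as nn

N_CHANNELS = 12    # assumed number of fused ultrasound + IMU channels (illustrative)
WINDOW_LEN = 100   # assumed samples per classification window (illustrative)
N_CLASSES = 7      # sitting, standing, level walking, ramp ascent/descent, stair ascent/descent

class SpatialTemporalCNN(nn.Module):
    """Spatial-temporal 1-D CNN for continuous ambulation mode classification."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            # "spatial" convolution mixes sensor channels at each time step
            nn.Conv1d(N_CHANNELS, 32, kernel_size=1), nn.ReLU(),
            # "temporal" convolutions capture dynamics along the time axis
            nn.Conv1d(32, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(64, N_CLASSES)

    def forward(self, x):             # x: [batch, N_CHANNELS, WINDOW_LEN]
        z = self.features(x).squeeze(-1)
        return self.classifier(z)     # logits over the seven ambulation modes

# Continuous classification sketch: slide a window over the synchronized data stream
model = SpatialTemporalCNN()
window = torch.randn(1, N_CHANNELS, WINDOW_LEN)   # placeholder for one sensor window
predicted_mode = model(window).argmax(dim=1)      # index of the predicted ambulation mode

In continuous (offline or online) use, this window would advance along the recorded gait data at a fixed increment, producing a stream of mode predictions that can be compared against the labeled circuit activities.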
Volitional control systems for powered prostheses require the detection of user intent to operate in real-life scenarios. Ambulation mode classification has been proposed to address this issue. However, these approaches impose discrete labels on the otherwise continuous task that is ambulation. An alternative approach is to provide users with direct, voluntary control of the powered prosthesis motion. Surface electromyography (EMG) sensors have been proposed for this task, but poor signal-to-noise ratios and crosstalk from neighboring muscles limit performance. B-mode ultrasound can address some of these issues at the cost of reduced clinical viability due to a substantial increase in size, weight, and cost. Thus, there is an unmet need for a lightweight, portable neural sensing system that can effectively detect the movement intention of individuals with lower-limb amputation. Methods: In this study, we show that a small and lightweight A-mode ultrasound system can continuously predict prosthesis joint kinematics in seven individuals with transfemoral amputation across different ambulation tasks. Features from the A-mode ultrasound signals were mapped to the user's prosthesis kinematics via an artificial neural network. Results: Predictions on testing ambulation circuit trials resulted in mean normalized RMSEs across ambulation modes of 8.7 ± 3.1%, 4.6 ± 2.5%, 7.2 ± 1.8%, and 4.6 ± 2.4% for knee position, knee velocity, ankle position, and ankle velocity, respectively. Conclusion and Significance: This study lays the foundation for future applications of A-mode ultrasound for volitional control of powered prostheses during a variety of daily ambulation tasks.
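For reference, the normalized RMSE reported in these abstracts is typically the RMSE divided by the range of the measured signal; the exact normalization used by the authors is not stated here, so the following definition is an assumption:

\mathrm{nRMSE} = \frac{\sqrt{\frac{1}{T}\sum_{t=1}^{T}\left(\hat{\theta}_t - \theta_t\right)^2}}{\max_t \theta_t - \min_t \theta_t} \times 100\%

where \theta_t is the measured prosthesis joint trajectory (e.g., knee angle) over T samples and \hat{\theta}_t is the corresponding prediction from the neural network.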