2020
DOI: 10.1142/s2424905x20410019
Surface EMG-Based Hand Gesture Recognition via Hybrid and Dilated Deep Neural Network Architectures for Neurorobotic Prostheses

Abstract: Motivated by the potential of deep learning models to significantly improve myoelectric control of neuroprosthetic robotic limbs, this paper proposes two novel deep learning architectures, namely the [Formula: see text] ([Formula: see text]) and the [Formula: see text] ([Formula: see text]), for performing Hand Gesture Recognition (HGR) via multi-channel surface Electromyography (sEMG) signals. The work is aimed at enhancing the accuracy of myoelectric systems, which can be used for realizing an accurate an…
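The title refers to dilated network architectures. As an illustrative sketch only (not the paper's actual models, whose names are elided above), the core operation of such an architecture is a dilated 1-D convolution over an sEMG window: inserting gaps between kernel taps widens the receptive field without adding parameters. The toy signal and kernel below are assumptions for demonstration.

```python
# Illustrative sketch (not the paper's implementation): a single 1-D dilated
# convolution over one sEMG channel, showing how dilation widens the
# receptive field without adding parameters.

def dilated_conv1d(signal, kernel, dilation=1):
    """Valid-mode 1-D convolution with gaps of `dilation` between taps."""
    span = (len(kernel) - 1) * dilation  # receptive field minus one sample
    out = []
    for start in range(len(signal) - span):
        acc = 0.0
        for k, w in enumerate(kernel):
            acc += w * signal[start + k * dilation]
        out.append(acc)
    return out

# A kernel of length 3 with dilation 4 spans 9 samples per output value.
emg = [float(i % 5) for i in range(32)]   # toy single-channel window
feat = dilated_conv1d(emg, [0.25, 0.5, 0.25], dilation=4)
```

Stacking such layers with growing dilation rates (1, 2, 4, …) lets a compact network see long sEMG contexts, which is the usual motivation for dilated designs.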

Cited by 25 publications (26 citation statements) | References 11 publications
“…A common strategy used for hand gesture recognition in recent works is applying DNN with the focus on improving hand gestures classification performance on "never-seen-before repetitions". Along this line of research, several state-of-the-art works [10]- [12], [14], [16]- [22] mainly used the Ninapro database [23]- [25], which is a public dataset providing kinematic and sEMG signals from 52 finger, hand, and wrist movements. The Ninapro database is similar to data obtained in real-world conditions, and as such it allows development of advanced DNN-based recognition frameworks.…”
Section: Related Work
confidence: 99%
“…The common approach in recent studies [10]- [12], [14], [16]- [22], following the recommendations provided by the Ninapro database, is to train DNN-based models on a training set consisting of approximately 2/3 of the gesture trials of each subject. The evaluation is then performed on the remaining trials constituting the test set.…”
Section: Related Work
confidence: 99%
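The protocol quoted above can be sketched in code. The snippet below is a minimal, hypothetical illustration of a per-subject split by repetition (the Ninapro recommendation is to hold out specific repetitions; the exact repetition indices and data layout here are assumptions, not the database's prescribed split):

```python
# Sketch of the evaluation protocol described above: roughly 2/3 of each
# gesture's repetitions train the model, the remaining repetitions test it,
# so every gesture class appears in both sets. Data layout is hypothetical.
from collections import defaultdict

def split_by_repetition(trials, train_frac=2/3):
    """trials: list of (gesture_id, repetition_id, window) tuples."""
    by_gesture = defaultdict(list)
    for t in trials:
        by_gesture[t[0]].append(t)
    train, test = [], []
    for gesture, ts in by_gesture.items():
        reps = sorted({t[1] for t in ts})
        cut = set(reps[: max(1, round(len(reps) * train_frac))])
        for t in ts:
            (train if t[1] in cut else test).append(t)
    return train, test

# Toy data: 3 gestures x 6 repetitions -> 4 reps train, 2 reps test each.
toy = [(g, r, None) for g in range(3) for r in range(6)]
train, test = split_by_repetition(toy)
```

Splitting by repetition rather than by random windows avoids leaking samples from the same trial into both sets, which is why the "never-seen-before repetitions" framing matters.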
“…The captured image is enhanced using contrast correction, then converted to grey scale and subsequently processed for edge detection purposes. While more complex techniques are available [34,35], the computational intensity and challenges in implementing them into embedded code systems on board the Bebionic hand dictated the implementation of leaner algorithms. In this paper, we utilized light image processing approaches to simplify the computation and maximize practicality and translation to the clinic.…”
Section: Computer Vision Subsystem
confidence: 99%
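The "lean" pipeline described in the quote (contrast correction, then greyscale conversion, then edge detection) can be sketched as below. This is an assumed illustration in pure Python on a nested-list RGB image; the function names and the choice of a Sobel operator are the sketch's own, not the cited system's actual firmware:

```python
# Hypothetical sketch of a lean vision pipeline: contrast stretch ->
# greyscale -> Sobel edge magnitude, on an HxW nested list of (r, g, b).

def to_gray(rgb):
    """ITU-R BT.601 luma from an HxWx3 nested list."""
    return [[0.299*r + 0.587*g + 0.114*b for (r, g, b) in row] for row in rgb]

def stretch_contrast(gray):
    """Linear contrast stretch to the full [0, 255] range."""
    lo = min(min(row) for row in gray)
    hi = max(max(row) for row in gray)
    scale = 255.0 / (hi - lo) if hi > lo else 0.0
    return [[(v - lo) * scale for v in row] for row in gray]

def sobel_magnitude(gray):
    """Gradient magnitude with 3x3 Sobel kernels (interior pixels only)."""
    h, w = len(gray), len(gray[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = (gray[y-1][x+1] + 2*gray[y][x+1] + gray[y+1][x+1]
                  - gray[y-1][x-1] - 2*gray[y][x-1] - gray[y+1][x-1])
            gy = (gray[y+1][x-1] + 2*gray[y+1][x] + gray[y+1][x+1]
                  - gray[y-1][x-1] - 2*gray[y-1][x] - gray[y-1][x+1])
            out[y][x] = (gx*gx + gy*gy) ** 0.5
    return out

# Toy image: vertical dark/bright step -> strong response at the boundary.
img = [[(20, 20, 20) if x < 4 else (220, 220, 220) for x in range(8)]
       for y in range(8)]
edges = sobel_magnitude(stretch_contrast(to_gray(img)))
```

Each stage is a few multiply-adds per pixel with no external dependencies, which is the kind of computational budget an embedded prosthesis controller can afford.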
“…Recent evolution in Machine Learning (ML) and Deep Neural Networks (DNNs), coupled with advancements in neuro-rehabilitation technologies, has paved the way for the development of new control systems for myoelectric prostheses. In this regard, surface Electromyogram (sEMG) signals [1]- [5] are typically considered as the input for the control system of prostheses. In particular, the use of machine intelligence for sEMG-based Hand Gesture Recognition (HGR) has been the focus of the literature due to its unique potential to improve control functionality and consequently increase the quality of life of individuals lacking a biological limb.…”
Section: Introduction
confidence: 99%