2022
DOI: 10.1109/access.2022.3170483

Forearm Orientation and Muscle Force Invariant Feature Selection Method for Myoelectric Pattern Recognition

Abstract: An electromyogram (EMG) signal-based prosthetic hand can restore an amputee's missing functionalities, which requires a faithful EMG pattern recognition (EMG-PR) system. However, forearm orientation and muscle force variation make the EMG-PR system more complex, and the problem becomes more complicated when muscle force levels and forearm orientations vary simultaneously. These problems can be minimized by using a larger number of features or high-density surface EMG, but this increases design com…

Cited by 5 publications (2 citation statements)
References 44 publications
“…When there is a high number of features and/or channels (sensors) to describe the signal, it is said that it has a high dimensionality, which is not always recommended when classifying, so there are algorithms for its reduction [6], [7], [8]. In this sense, there are two main methods: feature selection algorithms and feature reduction algorithms.…”
mentioning (confidence: 99%)
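The distinction drawn in that statement can be illustrated with a short sketch (not taken from the cited paper): feature selection keeps a subset of the original EMG feature columns, while feature reduction projects all of them onto new components. The array shapes, the use of mutual information as the ranking criterion, and k = 10 are illustrative assumptions.

```python
# Minimal sketch contrasting the two families mentioned above:
# feature *selection* keeps a subset of the original EMG features,
# feature *reduction* mixes them into new axes. Shapes are illustrative.
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 48))      # e.g. 200 windows x (6 channels * 8 features)
y = rng.integers(0, 4, size=200)    # 4 hypothetical hand gestures

# Feature selection: rank original features by mutual information, keep the top 10.
selector = SelectKBest(mutual_info_classif, k=10).fit(X, y)
X_sel = selector.transform(X)       # columns are a subset of the originals

# Feature reduction: PCA projects all 48 features onto 10 new components.
X_red = PCA(n_components=10).fit_transform(X)

print(X_sel.shape, X_red.shape)     # (200, 10) (200, 10)
```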
“…Filter methods have low computational costs. However, selected features do not achieve good classification performance as they can sometimes miss some critical assumptions about the underlying regression function linking input variables to the output [7], [8]. The best-known filters are Information Gain, Gain Ratio, Term Variance, Mutual Information, Gini Index, Laplacian Score, Relief-F, and Fisher Score, among others.…”
mentioning (confidence: 99%)
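As a concrete example of one filter criterion named in that statement, the Fisher score can be computed directly from per-class feature statistics, without training any classifier, which is why filter methods stay computationally cheap. The sketch below is illustrative only; the data shapes and the small epsilon are assumptions, not values from the cited paper.

```python
# Hedged sketch of a Fisher-score filter: between-class scatter over
# within-class scatter, computed per feature column.
import numpy as np

def fisher_score(X, y):
    """Return one score per feature; higher means better class separation."""
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    numer = np.zeros(X.shape[1])
    denom = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        n_c = Xc.shape[0]
        numer += n_c * (Xc.mean(axis=0) - overall_mean) ** 2
        denom += n_c * Xc.var(axis=0)
    return numer / (denom + 1e-12)   # epsilon avoids division by zero

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 48))       # illustrative EMG feature matrix
y = rng.integers(0, 4, size=200)     # illustrative gesture labels
scores = fisher_score(X, y)
top10 = np.argsort(scores)[::-1][:10]  # indices of the 10 highest-scoring features
```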