Upper limb amputation severely limits the amputee's movement, and patients who have lost the use of one or more upper extremities have difficulty performing activities of daily living. To improve pattern-recognition-based control of upper-limb prostheses, this paper proposes a non-invasive approach that combines EEG and EMG signals with machine learning techniques to recognize the upper-limb motions of subjects. The combined EMG and EEG signals are described by five features and used to classify seven hand and wrist movements: wrist flexion (WF), wrist extension (WE), hand open (HO), hand close (HC), pronation (PRO), supination (SUP), and rest (RST). Experiments demonstrate that, using the mean absolute value (MAV), waveform length (WL), Willison amplitude (WAMP), slope sign changes (SSC), and cardinality features, the proposed algorithm achieves a classification accuracy of 89.6% when classifying the seven distinct hand and wrist movements.
Index Terms— Human-Robot Interaction, Bio-signal Analysis, LDA Classifier.
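To make the feature set named in the abstract concrete, the following is a minimal Python sketch of how the five time-domain features could be computed from a single signal window and passed to an LDA classifier. The threshold value, the cardinality approximation via amplitude quantization, and the scikit-learn LDA call are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def extract_features(window, thresh=0.01):
    """Five time-domain features from one 1-D signal window.
    The threshold and the cardinality quantization step are illustrative choices."""
    x = np.asarray(window, dtype=float)
    dx = np.diff(x)

    mav  = np.mean(np.abs(x))                    # Mean Absolute Value (MAV)
    wl   = np.sum(np.abs(dx))                    # Waveform Length (WL)
    wamp = np.sum(np.abs(dx) > thresh)           # Willison Amplitude: count of large sample-to-sample changes
    ssc  = np.sum(dx[:-1] * dx[1:] < 0)          # Slope Sign Changes (threshold-free variant)
    card = np.unique(np.round(x / thresh)).size  # Cardinality: number of distinct quantized amplitudes

    return np.array([mav, wl, wamp, ssc, card])

# Hypothetical usage: X holds one feature row per window, y the movement label
# (WF, WE, HO, HC, PRO, SUP, RST); an LDA model then separates the seven classes.
# X = np.vstack([extract_features(w) for w in windows])
# clf = LinearDiscriminantAnalysis().fit(X, y)
```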