This research focuses on a minimal process for classifying three upper-arm movements (elbow extension, shoulder extension, and combined shoulder and elbow extension) from three electromyography (EMG) signals, in order to control a 2-degree-of-freedom (DoF) robotic arm. The proposed minimal process consists of four parts: time division of the data, the Teager–Kaiser energy operator (TKEO), conventional EMG feature extraction (i.e., mean absolute value (MAV), zero crossings (ZC), slope-sign changes (SSC), and waveform length (WL)), and eight major machine learning models: decision tree (medium), decision tree (fine), k-nearest neighbor (KNN) (weighted KNN and fine KNN), support vector machine (SVM) (cubic SVM and fine Gaussian SVM), and ensemble (bagged trees and subspace KNN). We then compare and investigate 48 classification models (47 proposed models and 1 conventional model) on data from five healthy subjects. The results show that all classification models achieve accuracies between 74% and 98% with processing times below 40 ms, indicating an acceptable controller delay for robotic arm control. Moreover, we confirmed that the classification model with no time division, with TKEO, and with ensemble (subspace KNN) had the best performance, with an accuracy of 96.67%, a recall of 99.66%, and a precision of 96.99%. In short, the combination of the proposed TKEO and the ensemble (subspace KNN) model plays an important role in achieving reliable EMG classification.
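The TKEO and the four conventional time-domain EMG features named in the abstract have standard definitions, which can be sketched as follows. This is a minimal illustration in NumPy, not the authors' implementation; the threshold handling for ZC and SSC is a common convention and is an assumption here.

```python
import numpy as np

def tkeo(x):
    """Teager-Kaiser energy operator: psi[n] = x[n]^2 - x[n-1] * x[n+1]."""
    x = np.asarray(x, dtype=float)
    return x[1:-1] ** 2 - x[:-2] * x[2:]

def emg_features(x, threshold=0.0):
    """Conventional time-domain EMG features over one analysis window.

    The threshold parameter (used to suppress noise in ZC and SSC counts)
    is an illustrative assumption; the paper does not specify its value.
    """
    x = np.asarray(x, dtype=float)
    d = np.diff(x)
    # MAV: mean absolute value of the window
    mav = np.mean(np.abs(x))
    # ZC: sign changes whose amplitude step exceeds the threshold
    zc = int(np.sum((x[:-1] * x[1:] < 0) &
                    (np.abs(x[:-1] - x[1:]) >= threshold)))
    # SSC: changes in slope direction with at least one step above threshold
    ssc = int(np.sum((d[:-1] * d[1:] < 0) &
                     ((np.abs(d[:-1]) >= threshold) |
                      (np.abs(d[1:]) >= threshold))))
    # WL: cumulative absolute change (waveform length)
    wl = np.sum(np.abs(d))
    return {"MAV": mav, "ZC": zc, "SSC": ssc, "WL": wl}
```

In a pipeline like the one described, these features would be computed per channel (and per time division, where applicable) and concatenated into the input vector for the classifiers.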
This article addresses issues in human–robot cooperation, focusing in particular on robot operation using bio-signals. We propose a control scheme for a robot arm based on electromyography (EMG) signals that enables cooperative tasks between humans and robots, including teleoperation. A basic framework for achieving this task is presented, together with an analysis of EMG signals from upper-limb muscles for mapping hand motion. The objective of this work is to investigate the use of a wearable EMG device to control a robot arm in real time. Three EMG sensors are attached to the brachioradialis, biceps brachii, and anterior deltoid as the target muscles. Three motions were performed by moving the arm about the elbow joint, the shoulder joint, and a combination of the two joints, giving two degrees of freedom. Five subjects participated in the experiments. The results indicate that the overall accuracy of the system varied from 50% to 100% across the three motions for all subjects. This study further shows that upper-limb motion discrimination can be used to control a robotic manipulator arm with simplicity and low computational cost.
This paper sought to improve the precision of the alternating-current electrooculography (AC-EOG) gaze estimation method. The method consists of two core techniques: estimating eyeball movement from EOG signals and converting the eyeball movement into a gaze position. In conventional research, the estimation is computed from two EOG signals corresponding to vertical and horizontal movements. The conversion is based on an affine transformation whose parameters are computed from 24-point gazing data collected at calibration. However, the transformation is not applied to all 24 gazing points at once but to four spatially separated subsets (the quadrant method), and each result has different characteristics. We therefore propose a conversion method that handles all 24 gazing points simultaneously: an imaginary center (i.e., a 25th point) is assumed on the gaze coordinates from the 24-point gazing data, and a single affine transformation is applied to all 24 points. We then conducted a comparative investigation between the conventional and proposed methods. The average eye-angle error for the cross-shaped electrode attachment is x = 2.27° ± 0.46° and y = 1.83° ± 0.34°, whereas for the plus-shaped electrode attachment it is x = 0.94° ± 0.19° and y = 1.48° ± 0.27°. We conclude that the proposed method offers simpler and more precise EOG gaze estimation than the conventional method.
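The core of both the conventional and the proposed conversion is fitting an affine transformation from EOG signal space to gaze coordinates using calibration points. A minimal least-squares sketch of that fitting step is shown below; the function names and the use of `numpy.linalg.lstsq` are assumptions for illustration, not the paper's implementation, and the quadrant-splitting or imaginary-center details are omitted.

```python
import numpy as np

def fit_affine(eog_xy, gaze_xy):
    """Least-squares affine fit: gaze ~ A @ eog + b.

    eog_xy, gaze_xy: (N, 2) arrays of calibration samples (e.g. N = 24
    gazing points). Returns a (3, 2) parameter matrix P stacking A and b.
    """
    eog_xy = np.asarray(eog_xy, dtype=float)
    gaze_xy = np.asarray(gaze_xy, dtype=float)
    # Augment with a column of ones so the translation b is fit jointly with A
    X = np.hstack([eog_xy, np.ones((len(eog_xy), 1))])
    P, *_ = np.linalg.lstsq(X, gaze_xy, rcond=None)
    return P

def eog_to_gaze(eog_xy, P):
    """Apply the fitted affine transformation to one or more EOG samples."""
    eog_xy = np.atleast_2d(np.asarray(eog_xy, dtype=float))
    X = np.hstack([eog_xy, np.ones((len(eog_xy), 1))])
    return X @ P
```

In the conventional quadrant method, `fit_affine` would be called four times on four spatial subsets of the calibration data; the proposed method instead fits once over all 24 points.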
This paper describes a man–machine interface system using EOG and EMG. A manipulator control system based on these two signals is developed: eye movement (EOG) is used to move the robot's joint angles, and EMG is used for object grasping. The robot arm's joint movements are determined by an EOG discrimination method based on the polarity of the eye-gaze motion signals in channels Ch1 and Ch2, while an EMG discrimination method controls the arm gripper to grasp and release the target object. In the robot control experiment, we successfully controlled the uArm™ robot using both the EOG and EMG discrimination methods as control inputs. This control system demonstrates the feasibility of a man–machine interface for elderly and handicapped persons.
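The polarity-based EOG discrimination described above can be sketched as a simple rule that maps the signs of the two channels to joint commands. The channel-to-joint mapping, the command names, and the noise threshold below are all illustrative assumptions, since the abstract does not specify them.

```python
def classify_eog(ch1, ch2, threshold=0.5):
    """Map the polarity of two EOG channels to a joint-movement command.

    ch1, ch2: signed EOG amplitudes for the two channels.
    threshold: dead-band below which no movement is commanded
    (an assumed value; the paper does not give one).
    """
    # Dead band: neither channel shows a deliberate gaze motion
    if abs(ch1) < threshold and abs(ch2) < threshold:
        return "hold"
    # Route the command to the joint whose channel is more strongly active
    if abs(ch1) >= abs(ch2):
        return "joint1_positive" if ch1 > 0 else "joint1_negative"
    return "joint2_positive" if ch2 > 0 else "joint2_negative"
```

A parallel EMG rule (e.g., thresholding rectified muscle activity) would then toggle the gripper between grasp and release.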