This paper investigates hand gesture recognition from acoustic measurements at the wrist, toward a low-cost wearable human-computer interaction (HCI) device. A prototype with five microphone sensors on the human wrist was benchmarked by identifying 36 American Sign Language (ASL) gestures: the 26 ASL letters and the 10 ASL numbers. Three subjects were recruited to perform over 20 trials for each hand gesture. Ten features were extracted from the signal recorded by each sensor. Support Vector Machine (SVM), Decision Tree (DT), K-Nearest Neighbors (kNN), and Linear Discriminant Analysis (LDA) classifiers were compared; among these, LDA achieved the highest average classification accuracy, above 80%. These preliminary results suggest that the proposed technique is a promising means of developing a low-cost HCI device.
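The classifier comparison described above can be sketched with scikit-learn. This is not the authors' code: the synthetic stand-in data, the 50-dimensional feature vectors (10 features x 5 sensors), and the default hyperparameters are all assumptions for illustration only.

```python
# Minimal sketch of the four-classifier comparison on synthetic stand-in
# data; the dimensions mirror the setup in the abstract (36 gestures,
# ~20 trials each, 10 features x 5 sensors = 50 features per trial).
import numpy as np
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_gestures, n_trials, n_features = 36, 20, 50

# Synthetic data: each gesture class gets its own mean feature vector,
# and trials scatter around it with Gaussian noise.
means = rng.normal(size=(n_gestures, n_features))
X = np.vstack([mu + 0.5 * rng.standard_normal((n_trials, n_features))
               for mu in means])
y = np.repeat(np.arange(n_gestures), n_trials)

classifiers = {
    "SVM": SVC(kernel="rbf"),
    "DT": DecisionTreeClassifier(),
    "kNN": KNeighborsClassifier(n_neighbors=5),
    "LDA": LinearDiscriminantAnalysis(),
}
results = {}
for name, clf in classifiers.items():
    # 5-fold cross-validated accuracy, averaged over folds.
    results[name] = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: mean accuracy = {results[name]:.2f}")
```

On real data each row of `X` would hold the ten per-sensor features concatenated across the five microphones for one trial.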
To facilitate hand gesture recognition, we investigated the use of acoustic signals together with an accelerometer and gyroscope at the human wrist. As a proof of concept, the prototype consisted of 10 microphone units in contact with the skin, placed around the wrist along with an inertial measurement unit (IMU). Gesture recognition performance was evaluated by identifying 13 gestures used in daily life. The optimal area for acoustic sensor placement at the wrist was examined using the minimum redundancy maximum relevance (mRMR) feature selection algorithm. We recruited 10 subjects to perform over 10 trials for each hand gesture. Accuracy was 75% for a general model with the top 25 features selected, and the intra-subject average classification accuracy exceeded 80% with the same features using one microphone unit at the mid-anterior wrist and an IMU. These results indicate that acoustic signatures from the human wrist can aid IMU sensing for hand gesture recognition, and that selecting a few features common to all subjects can help in building a general model. The proposed multimodal framework helps address the bottleneck of single-IMU sensing of hand gestures during arm movement and/or locomotion.
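The mRMR feature selection step mentioned above can be illustrated with a small NumPy-only sketch. This is an assumption-laden toy, not the paper's implementation: it uses the correlation-based mRMR variant (absolute Pearson correlation for both relevance and redundancy), and the `mrmr_select` function and the synthetic features are hypothetical.

```python
# Greedy mRMR (correlation-based variant): at each step, pick the feature
# most correlated with the label, penalized by its mean correlation with
# the features already selected.
import numpy as np

def mrmr_select(X, y, k):
    n_features = X.shape[1]
    # Relevance: |Pearson correlation| between each feature and the label.
    relevance = np.array([abs(np.corrcoef(X[:, j], y)[0, 1])
                          for j in range(n_features)])
    # Redundancy: |Pearson correlation| between feature pairs.
    corr = np.abs(np.corrcoef(X, rowvar=False))
    selected = [int(np.argmax(relevance))]
    while len(selected) < k:
        remaining = [j for j in range(n_features) if j not in selected]
        scores = [relevance[j] - corr[j, selected].mean() for j in remaining]
        selected.append(remaining[int(np.argmax(scores))])
    return selected

# Toy demo: the label depends on two independent signals s1 and s2.
# Feature 1 is a near-duplicate of feature 0, so mRMR should avoid
# picking both and instead reach for feature 3, which carries s2.
rng = np.random.default_rng(0)
n = 500
s1, s2 = rng.standard_normal(n), rng.standard_normal(n)
y = s1 + s2
f0 = s1 + 0.1 * rng.standard_normal(n)   # captures s1
f1 = f0 + 0.01 * rng.standard_normal(n)  # redundant copy of f0
f2 = rng.standard_normal(n)              # pure noise
f3 = s2 + 0.1 * rng.standard_normal(n)   # captures s2
X = np.column_stack([f0, f1, f2, f3])

selected = mrmr_select(X, y, k=2)
print("selected features:", selected)
```

The redundancy penalty is what distinguishes mRMR from plain relevance ranking: ranking by correlation alone would pick the near-duplicate pair (features 0 and 1) first.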
Citation: Siddiqui N, Chan RHM (2020) Multimodal hand gesture recognition using single IMU and acoustic measurements at wrist. PLoS ONE 15(1): e0227039. https://doi.org/10.