In the evolving field of human–computer interaction (HCI), gesture recognition has emerged as a critical focus, with sensor-equipped smart gloves playing a central role. Despite the significance of dynamic gesture recognition, most research on data gloves has concentrated on static gestures, with only a small percentage addressing dynamic gestures or both. This study presents a low-cost smart glove prototype designed to capture and classify dynamic hand gestures for game control, equipped with five flex sensors, five force sensors, and one inertial measurement unit (IMU). To classify dynamic gestures, we developed a neural network-based classifier: a convolutional neural network (CNN) with three two-dimensional convolutional layers and rectified linear unit (ReLU) activation, which achieved 90% accuracy. The developed glove effectively captures dynamic gestures for game control, achieving high classification accuracy, precision, and recall, as evidenced by the confusion matrix and training metrics. Despite limitations in the number of gestures and participants, the solution offers a cost-effective and accurate approach to gesture recognition, with potential applications in VR/AR environments.
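As a rough illustration of the kind of classifier the abstract describes, the following is a minimal sketch of a three-layer 2D CNN with ReLU activation operating on windows of glove sensor data. The paper does not specify input dimensions, layer widths, kernel sizes, pooling, or the number of gesture classes, so the 16-channel × 64-time-step window (5 flex + 5 force + 6 assumed IMU axes), the 8 classes, and all layer sizes below are assumptions, not the authors' configuration.

```python
# Hypothetical sketch of a 3-layer 2D CNN gesture classifier (PyTorch).
# All shapes and hyperparameters are placeholders, not the study's values.
import torch
import torch.nn as nn

class GloveGestureCNN(nn.Module):
    def __init__(self, n_channels: int = 16, n_timesteps: int = 64, n_classes: int = 8):
        super().__init__()
        # Three 2D convolutional layers with ReLU, treating one sensor window
        # as a single-channel "image" of shape (n_channels, n_timesteps).
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
        )
        # Infer the flattened feature size from a dummy forward pass.
        with torch.no_grad():
            flat = self.features(torch.zeros(1, 1, n_channels, n_timesteps)).numel()
        self.classifier = nn.Linear(flat, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, n_channels, n_timesteps)
        return self.classifier(self.features(x).flatten(start_dim=1))

model = GloveGestureCNN()
logits = model(torch.randn(4, 1, 16, 64))  # e.g. a batch of 4 gesture windows
print(logits.shape)                        # torch.Size([4, 8])
```

Such a model would typically be trained with cross-entropy loss on labelled gesture windows; the reported 90% accuracy refers to the authors' own architecture and data, not to this sketch.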
Gesture recognition has become a significant part of human–machine interaction, particularly when verbal interaction is not feasible. The rapid development of biomedical sensing and machine learning algorithms, including electromyography (EMG) and convolutional neural networks (CNNs), has enabled the interpretation of sign languages, including Polish Sign Language, from EMG signals. The objective was to classify game control gestures and Polish Sign Language gestures recorded specifically for this study using two data acquisition systems: BIOPAC MP36 and MyoWare 2.0. We compared the classification performance of various machine learning algorithms, with particular emphasis on CNNs, on a dataset of EMG signals representing 24 gestures recorded with both types of EMG sensors. The results (98.324% versus ≤7.8571% and 95.5307% versus ≤10.2697% accuracy for CNNs and the other classifiers on data recorded with the BIOPAC MP36 and MyoWare, respectively) indicate that CNNs achieve superior accuracy. These results suggest the feasibility of using lower-cost sensors for effective gesture classification and the viability of integrating affordable EMG-based technologies into broader gesture recognition frameworks, providing a cost-effective solution for real-world applications. The dataset created during the study offers a basis for future studies on EMG-based recognition of Polish Sign Language.
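To make the comparison concrete, below is a minimal sketch of evaluating classical baseline classifiers on windowed EMG data, of the kind a CNN would be compared against in such a study. The EMG channel count, window length, sample counts, and classifier settings are placeholders, and the synthetic data stands in for the study's recordings; none of these reflect the authors' actual pipeline or results.

```python
# Hypothetical baseline comparison on segmented EMG gesture windows.
# Assumes each labelled sample is a (channels x timesteps) window; all
# sizes and classifier settings are illustrative placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(480, 2, 200))   # 480 windows, 2 EMG channels, 200 samples each
y = rng.integers(0, 24, size=480)    # 24 gesture labels, as in the study

X_flat = X.reshape(len(X), -1)       # flatten each window into a feature vector
X_tr, X_te, y_tr, y_te = train_test_split(X_flat, y, test_size=0.2, random_state=0)

baselines = {
    "SVM": make_pipeline(StandardScaler(), SVC()),
    "Random forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "kNN": make_pipeline(StandardScaler(), KNeighborsClassifier()),
}
for name, clf in baselines.items():
    clf.fit(X_tr, y_tr)
    print(f"{name}: {clf.score(X_te, y_te):.3f} accuracy")
```

In the study itself, a CNN (for example, a 2D architecture similar to the glove sketch above, adapted to EMG input) would be trained and evaluated on the same splits as these baselines to produce the reported accuracy comparison.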