The search for optimized forms of human-computer interaction (HCI) has intensified as the combination of biosignals with virtual reality (VR) and augmented reality (AR) shows growing potential to enable the next generation of personal computing. At its core, this requires decoding the user's biosignals into digital commands. Electromyography (EMG) is a biosensing modality of particular interest due to the ease of data collection, its relatively high signal-to-noise ratio, its non-invasiveness, and the fact that the signal can be interpreted as being generated by (intentional) muscle activity. Here, we investigate the potential of a simple two-channel EMG setup to differentiate five distinct movements. EMG was recorded from two bipolar sensors placed over forearm muscles that control the fingers (extensor digitorum, flexor digitorum profundus) while a subject performed 50 trials of dorsal extension and return for each of the five digits. The maximum and mean values across each trial were computed for each channel and used as features. A k-nearest neighbors (kNN) classifier was trained; overall five-class accuracy reached 94% when the full trial window was used, and simulated real-time classification with the trained kNN model (k = 3) and a 280 ms sliding window reached 90.4%. Additionally, unsupervised clustering achieved a homogeneity score of 85%. This study demonstrates that reliable decoding of different natural movements is possible with fewer than one channel per class, even without taking temporal features of the signal into account. The technical feasibility of the approach in a real-time setting was validated by streaming EMG data in real time to a custom Unity3D VR application via the Lab Streaming Layer (LSL) to control a user interface. Further use cases in gamification and rehabilitation were also examined, alongside the integration of eye tracking and gesture recognition for a sensor-fusion approach to HCI and the decoding of user intent.
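The classification pipeline described above (max and mean per channel as features, kNN with k = 3, and a 280 ms window for simulated real-time use) can be sketched roughly as follows. This is a minimal illustration, not the authors' code: the sampling rate, data shapes, variable names, window hop, and train/test split are assumptions, and scikit-learn's kNN implementation is used as a stand-in for whatever the study employed.

```python
# Hypothetical sketch: per-trial max/mean features from 2 EMG channels,
# kNN classification (k=3), and a simulated real-time pass with a 280 ms window.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

FS = 1000                 # assumed sampling rate in Hz
WIN = int(0.280 * FS)     # 280 ms window, as in the abstract
HOP = WIN // 4            # assumed hop between successive windows

def features(emg):
    """emg: array of shape (n_samples, 2 channels).
    Returns 4 features: max and mean of the rectified signal per channel."""
    rect = np.abs(emg)
    return np.concatenate([rect.max(axis=0), rect.mean(axis=0)])

def fit_knn(trials, k=3):
    """trials: list of (emg_array, digit_label) pairs, e.g. 5 digits x 50 trials."""
    X = np.vstack([features(emg) for emg, _ in trials])
    y = np.array([label for _, label in trials])
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
    clf = KNeighborsClassifier(n_neighbors=k).fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))
    return clf

def sliding_window_predict(clf, emg_stream):
    """Simulated real-time pass: classify successive 280 ms windows."""
    preds = []
    for start in range(0, len(emg_stream) - WIN + 1, HOP):
        window = emg_stream[start:start + WIN]
        preds.append(clf.predict(features(window)[None, :])[0])
    return preds
```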
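On the streaming side, the abstract mentions sending real-time EMG data to a custom Unity3D application through the Lab Streaming Layer. The sketch below shows one plausible producer using pylsl; the stream name, channel count, sampling rate, and source ID are placeholders, and the Unity3D client (e.g. via an LSL plugin for Unity) would resolve the stream by name and consume the samples.

```python
# Minimal, assumption-laden sketch of pushing live 2-channel EMG samples over
# Lab Streaming Layer with pylsl for a Unity3D client to consume.
import time
import numpy as np
from pylsl import StreamInfo, StreamOutlet

FS = 1000  # assumed sampling rate in Hz
info = StreamInfo(name="EMG", type="EMG", channel_count=2,
                  nominal_srate=FS, channel_format="float32",
                  source_id="emg-demo")  # identifiers are placeholders
outlet = StreamOutlet(info)

while True:
    sample = np.random.randn(2).astype(np.float32)  # stand-in for amplifier data
    outlet.push_sample(sample.tolist())
    time.sleep(1.0 / FS)
```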