Motor imagery is a popular technique employed as a motor rehabilitation tool or to control assistive devices that substitute for lost motor function. In both areas of application, artificial somatosensory input helps to mirror the sensorimotor loop by providing kinesthetic feedback or guidance in a more intuitive fashion than visual input alone. In this work, we study directional and movement-related information in electroencephalographic signals acquired during a visually guided center-out motor imagery task in two conditions, i.e., with and without additional somatosensory input in the form of vibrotactile guidance. Imagined movements to the right and forward could be discriminated in low-frequency electroencephalographic amplitudes with group-level peak accuracies of 70% with vibrotactile guidance and 67% without; these peak accuracies did not differ significantly. Furthermore, the motor imagery could be classified against a resting baseline with group-level accuracies between 76% and 83%, using either low-frequency amplitude features or μ- and β-band power spectral features. On average, accuracies were higher with vibrotactile guidance, although the difference was significant only for the latter feature set. Our findings suggest that directional information in low-frequency electroencephalographic amplitudes is retained in the presence of vibrotactile guidance. Moreover, they hint at an enhancing effect of vibrotactile guidance on motor-related μ and β spectral features.
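To make the decoding approach concrete, the following is a minimal, hypothetical sketch of binary classification of imagined movement direction from low-frequency EEG amplitudes. The band edge (3 Hz), the shrinkage-LDA classifier, the data shapes, and all variable names are illustrative assumptions standing in for the study's actual pipeline; the EEG here is synthetic noise, so the resulting accuracy is near chance.

```python
# Hypothetical sketch: classifying two imagined movement directions from
# low-frequency EEG amplitude features. Not the authors' exact pipeline;
# band edges, shapes, and the classifier choice are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
fs = 250                                      # sampling rate in Hz (assumed)
n_trials, n_chan, n_samp = 120, 32, 2 * fs    # synthetic stand-in for real EEG
X = rng.standard_normal((n_trials, n_chan, n_samp))
y = rng.integers(0, 2, n_trials)              # 0 = right, 1 = forward

# Low-pass filter to retain the low-frequency amplitude course (< 3 Hz).
b, a = butter(4, 3.0 / (fs / 2), btype="low")
X_lf = filtfilt(b, a, X, axis=-1)

# Downsample in time and flatten channels x time into one feature vector.
feats = X_lf[:, :, ::25].reshape(n_trials, -1)

# Shrinkage-regularized LDA is a common choice for high-dimensional EEG
# features; cross-validated accuracy is the usual evaluation metric.
# With random synthetic data this prints roughly chance level (~0.5).
clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
print(cross_val_score(clf, feats, y, cv=5).mean())
```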
Establishing the basic knowledge, methodology, and technology for a framework for the continuous decoding of hand/arm movement intention was the aim of the ERC-funded project “Feel Your Reach”. In this work, we review the studies and methods we performed and implemented over the last six years, which form the basis for enabling severely paralyzed people to non-invasively control a robotic arm in real time from the electroencephalogram (EEG). In detail, we investigated goal-directed movement detection, decoding of executed and attempted movement trajectories, grasping correlates, error processing, and kinesthetic feedback. Although we have already tested some of our approaches with the target populations, we still need to transfer the “Feel Your Reach” framework to people with cervical spinal cord injury and evaluate the decoders’ performance while participants attempt upper-limb movements. While we have made major progress toward this ambitious goal, we also critically discuss current limitations.
Brain–computer interfaces (BCIs) are devices that use brain signals for control or communication. Since they do not require movement of any part of the body, BCIs are the natural choice for assisted communication when a person is unable to move. In this article, a BCI-based communicator for persons in the locked-in state is described. It is based on the user's P300 brain response and thus requires no prior training, movement, or imagination of movement. An auditory paradigm was selected so that the communicator can also be applied in cases where visual ability is impaired. The communicator was additionally designed to test whether low-cost hardware with a reduced electrode set can be used efficiently in everyday environments, without the need for expert personnel. The design of the communicator is described first, followed by detailed analyses of its performance when used by healthy and disabled subjects. It is shown that the auditory paradigm is the primary factor limiting communication accuracy. Hardware characteristics and the reduced electrode set also affect accuracy negatively, while different question and answer types produce no major differences.
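As a rough illustration of how such a P300 communicator can decide among candidate answers, the following hypothetical sketch averages the EEG epochs following each answer's stimuli and selects the answer whose average best matches a generic P300 template. The template shape, scoring rule, and all names are assumptions for illustration, not the described system's actual algorithm.

```python
# Hypothetical sketch of a P300 decision rule: averaging boosts the
# signal-to-noise ratio of the event-related potential, and the attended
# answer should show the strongest P300-like deflection ~300 ms post-stimulus.
import numpy as np

def select_answer(epochs_per_answer, fs):
    """epochs_per_answer: list of arrays shaped (n_repetitions, n_samples)."""
    t = np.arange(epochs_per_answer[0].shape[1]) / fs
    # Crude P300 template: positive Gaussian bump peaking at 300 ms.
    template = np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))
    scores = []
    for epochs in epochs_per_answer:
        avg = epochs.mean(axis=0)        # average over stimulus repetitions
        scores.append(np.dot(avg, template))
    return int(np.argmax(scores))        # index of the presumed attended answer

fs = 250
rng = np.random.default_rng(2)
epochs = [rng.standard_normal((10, fs)) for _ in range(4)]  # four candidate answers
print(select_answer(epochs, fs))
```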
Background: Motor imagery is a cognitive process of imagining the performance of a motor task without actual muscle movement. It is often used in rehabilitation and in assistive technologies to control a brain–computer interface (BCI). This paper compares different time–frequency representations (TFRs) and their Rényi and Shannon entropies for sensorimotor rhythm (SMR) based motor imagery control signals in electroencephalographic (EEG) data. The motor imagery task was performed with visual guidance, with combined visual and vibrotactile (somatosensory) guidance, or with a visual cue only.
Results: Using TFR-based entropy features as input for classifying different interaction intentions yielded higher accuracies (up to 99.87%) than regular time-series amplitude features (up to 85.91%), an improvement over existing methods. In particular, the highest accuracy was achieved for classifying motor imagery against the baseline (rest state) using Shannon entropy with the Reassigned Pseudo Wigner–Ville time–frequency representation.
Conclusions: Our findings suggest that the amount of useful, classifiable motor imagery information (as captured by entropy) changes during the motor imagery period relative to the baseline period. As a result, entropy features yield higher classification accuracy and F1 scores than amplitude features, which manifests as an improved ability to detect motor imagery.
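The core feature-extraction step can be illustrated with a short, hypothetical sketch: treat a time–frequency representation of an EEG window as a probability distribution and compute its Shannon entropy. Since SciPy ships no Reassigned Pseudo Wigner–Ville implementation, a plain spectrogram stands in here; all parameters and signal choices are illustrative assumptions.

```python
# Hypothetical sketch: Shannon entropy of a time-frequency representation
# as a scalar feature per EEG window. A spectrogram stands in for the
# Reassigned Pseudo Wigner-Ville TFR used in the paper.
import numpy as np
from scipy.signal import spectrogram

def tfr_shannon_entropy(x, fs):
    """Shannon entropy (in bits) of the normalized spectrogram energy of x."""
    _, _, Sxx = spectrogram(x, fs=fs, nperseg=128, noverlap=96)
    p = Sxx / Sxx.sum()            # treat the TFR as a probability distribution
    p = p[p > 0]                   # avoid log(0)
    return -np.sum(p * np.log2(p))

fs = 250
t = np.arange(fs * 2) / fs
rest = np.random.default_rng(1).standard_normal(fs * 2)  # broadband "rest"
mi = np.sin(2 * np.pi * 11 * t)                          # narrowband mu-like rhythm
print(tfr_shannon_entropy(rest, fs), tfr_shannon_entropy(mi, fs))
```

A narrowband, SMR-like oscillation concentrates its TFR energy into few bins and therefore has lower entropy than broadband rest activity; this contrast is what such entropy features exploit for classification.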