This paper presents a methodology to detect the intention to make a reaching movement with the arm in healthy subjects before the movement actually starts. This is done by measuring brain activity through electroencephalographic (EEG) signals recorded by electrodes placed over the scalp. The preparation and performance of an arm movement generate a phenomenon called event-related desynchronization (ERD) in the mu and beta frequency bands. A novel methodology is presented to characterize this cognitive process, based on three sums of spectral power over the frequencies involved in ERD. The main objective of this paper is to benchmark several classifiers and select the most suitable one. The best results are obtained with an SVM classifier, at around 72% accuracy. This classifier will be used in further research to generate the control commands that move a robotic exoskeleton, helping people with motor disabilities to perform the movement. The final aim is for this brain-controlled robotic exoskeleton to improve current rehabilitation processes for disabled people.
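The following is a minimal, hypothetical sketch of the kind of pipeline this abstract describes: per-channel spectral power is summed over three bands covering the mu and beta range and fed to an SVM. The sampling rate, band edges, epoch shapes, and the random placeholder data are illustrative assumptions, not the authors' actual parameters.

```python
# Hypothetical sketch: mu/beta band-power sums + SVM, assuming EEG epochs of
# shape (n_trials, n_channels, n_samples) sampled at FS Hz.
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

FS = 256                                 # assumed sampling rate (Hz)
BANDS = [(8, 12), (13, 20), (21, 30)]    # assumed mu band and two beta sub-bands

def band_power_features(epochs, fs=FS, bands=BANDS):
    """Sum PSD bins inside each band per channel, then flatten per trial."""
    freqs, psd = welch(epochs, fs=fs, nperseg=fs, axis=-1)
    feats = []
    for lo, hi in bands:
        idx = (freqs >= lo) & (freqs <= hi)
        feats.append(psd[..., idx].sum(axis=-1))   # (n_trials, n_channels)
    return np.concatenate(feats, axis=-1)          # (n_trials, n_channels * n_bands)

# Placeholder data just to make the sketch runnable:
# y = 1 for pre-movement (intention) epochs, 0 for rest epochs.
rng = np.random.default_rng(0)
X_epochs = rng.standard_normal((60, 8, 2 * FS))
y = rng.integers(0, 2, size=60)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, band_power_features(X_epochs), y, cv=5)
print("CV accuracy: %.2f ± %.2f" % (scores.mean(), scores.std()))
```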
Lower-limb robotic exoskeletons are wearable devices that can benefit people with lower-extremity motor impairment, as they are valuable for rehabilitation and assistance. These devices can be controlled mentally by means of brain–machine interfaces (BMI). The aim of the present study was to design a BMI based on motor imagery (MI) to control the gait of a lower-limb exoskeleton. The evaluation was carried out with able-bodied subjects as a preliminary study, since the potential users are people with motor limitations. The proposed control works as a state machine, i.e., the decoding algorithm differs between issuing the start command (while standing still) and the stop command (while walking). The BMI combines two paradigms to reduce the false triggering rate (when the BMI identifies irrelevant brain tasks as MI): one based on motor imagery and another based on the user's attention to the gait. The research was divided into two parts. First, during the training phase, results showed an average accuracy of 68.44 ± 8.46% for the MI paradigm and 65.45 ± 5.53% for the attention paradigm. Then, during the test phase, the exoskeleton was controlled by the BMI and the average performance was 64.50 ± 10.66%, with very few false positives. Participants completed several sessions, and there was a significant improvement over time. These results indicate that, after several sessions, the developed system could be employed to control a lower-limb exoskeleton, which could benefit people with motor impairment as an assistive device and/or as a therapeutic approach with very limited false activations.
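Below is a minimal sketch of the state-machine control logic outlined in this abstract, assuming each paradigm produces a per-window probability. The class names, the probability threshold, and the rule that both paradigms must agree before issuing the start command are illustrative assumptions rather than the authors' exact design.

```python
# Hypothetical sketch of the start/stop state machine: the exoskeleton only
# starts walking when both paradigms (motor imagery and attention to gait)
# detect gait-related activity, and only stops from the walking state.
from enum import Enum

class State(Enum):
    STANDING = 0
    WALKING = 1

THRESHOLD = 0.7  # assumed probability threshold to reduce false triggering

def update_state(state, p_mi, p_attention):
    """Advance the state machine given per-window classifier probabilities."""
    if state is State.STANDING:
        # start command: both paradigms must agree (assumed fusion rule)
        if p_mi > THRESHOLD and p_attention > THRESHOLD:
            return State.WALKING
    else:
        # stop command: decoded with the walking-state model only
        if p_mi < (1 - THRESHOLD):
            return State.STANDING
    return state

# toy usage with made-up probabilities from successive EEG windows
state = State.STANDING
for p_mi, p_att in [(0.4, 0.8), (0.8, 0.9), (0.6, 0.5), (0.2, 0.3)]:
    state = update_state(state, p_mi, p_att)
    print(state)
```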
Motor imagery (MI) is one of the most common paradigms used in brain–computer interfaces (BCIs). This mental process is defined as the imagination of movement without any actual motion. In some lower-limb exoskeletons controlled by BCIs, users have to perform MI continuously in order to move the exoskeleton. This makes it difficult to design a closed-loop control BCI, as it cannot be assured that the analyzed activity is related to imagery rather than to motion. A possible solution is the use of virtual reality (VR). During a VR training phase, subjects can focus on MI while avoiding distractions, which could help them build a robust model for the BCI classifier that is later used to control the exoskeleton. This paper analyzes whether gait MI can be improved when VR feedback is provided to subjects instead of visual feedback on a screen. Additionally, both types of visual feedback are analyzed while subjects are seated or standing. In the analysis, visual feedback by VR was associated with higher performance in the majority of cases, with no relevant differences between standing and sitting. The paper also presents a case study of closed-loop control of the BCI in a virtual reality environment: subjects had to perform gait MI or remain in a relaxation state and, based on the output of the BCI, the immersive first-person view remained static or started to move. Experiments showed an accuracy of issued commands of 91.0 ± 6.7%, which is a very satisfactory result.
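The following hypothetical sketch illustrates the closed-loop case study described above: each decoded EEG window either advances or holds the immersive first-person view. The classifier, the feature dimensions, the placeholder training data, and the VR interface are assumptions for illustration, since the abstract does not specify them.

```python
# Hypothetical sketch of the closed loop: a window decoded as gait motor
# imagery moves the first-person VR view forward; a relaxation-state window
# keeps it static. DummyVRView stands in for the (unspecified) VR interface.
import numpy as np
from sklearn.svm import SVC

class DummyVRView:
    """Stand-in for the VR environment interface (assumed, not a real API)."""
    def advance(self):
        print("VR view moving forward")
    def hold(self):
        print("VR view static")

def closed_loop_step(features, model, vr_view, mi_label=1):
    """Issue one control command from the decoded class of the current window."""
    if model.predict(features.reshape(1, -1))[0] == mi_label:
        vr_view.advance()   # gait motor imagery detected
    else:
        vr_view.hold()      # relaxation state detected

# toy usage: train on random placeholder features, then run a few steps
rng = np.random.default_rng(0)
X, y = rng.standard_normal((40, 16)), rng.integers(0, 2, size=40)
model = SVC().fit(X, y)
vr = DummyVRView()
for window in rng.standard_normal((3, 16)):
    closed_loop_step(window, model, vr)
```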