Assistive devices, such as exoskeletons or orthoses, often make use of physiological data that allow the detection or prediction of movement onset. Movement onset can be detected at the executing site, the skeletal muscles, e.g., by means of electromyography. Movement intention can be detected through the analysis of brain activity, recorded, e.g., by electroencephalography, or in the behavior of the subject, e.g., by eye movement analysis. Which of these approaches is used depends on the kind of neuromuscular disorder, the state of therapy, or the assistive device. In this work we conducted experiments in which healthy subjects performed self-initiated and self-paced arm movements. While other studies have shown that multimodal signal analysis can improve the performance of predictions, we show that a sensible combination of electroencephalographic and electromyographic data can potentially improve the adaptability of assistive technical devices with respect to the individual demands of, e.g., early and late stages in rehabilitation therapy. In earlier stages, for patients with weak muscle activity or weak motor-related brain activity, it is important to achieve high positive detection rates to support self-initiated movements. Detecting most movement intentions from electroencephalographic or electromyographic data motivates patients and can enhance their progress in rehabilitation. In a later stage, for patients with stronger muscle or brain activity, reliable movement prediction is more important to encourage patients to behave more accurately and to invest more effort in the task; further, the false detection rate needs to be reduced. We propose that both types of physiological data can be used in an AND combination, in which both signals must be detected to drive a movement. With this approach the behavior of the patient during later therapy can be controlled better, and false positive detections, which can be very annoying for patients who are further advanced in rehabilitation, can be avoided.
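The stage-dependent fusion described above can be illustrated with a minimal sketch. It is not the detection pipeline used in the study; the confidence scores, thresholds, and function name are hypothetical, and the two fusion modes simply contrast an OR rule (early stage, favoring a high positive detection rate) with the proposed AND rule (later stage, suppressing false positives).

```python
def fuse_detections(eeg_score, emg_score,
                    eeg_thresh=0.5, emg_thresh=0.5,
                    mode="and"):
    """Fuse per-modality movement detections (illustrative only).

    eeg_score, emg_score : per-trial classifier confidences in [0, 1]
    mode : "or"  -> early rehabilitation stage; either modality suffices,
                    maximizing positive detections to support the patient
           "and" -> later stage; both modalities must agree, reducing
                    false positive detections and demanding more effort
    """
    eeg_detected = eeg_score >= eeg_thresh
    emg_detected = emg_score >= emg_thresh
    if mode == "or":
        return eeg_detected or emg_detected
    return eeg_detected and emg_detected

# Example: weak EMG activity but clear motor-related EEG activity
print(fuse_detections(0.8, 0.3, mode="or"))   # True  -> movement is supported
print(fuse_detections(0.8, 0.3, mode="and"))  # False -> more effort required
```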
The ability of today's robots to autonomously support humans in their daily activities is still limited. To improve this, predictive human-machine interfaces (HMIs) can be applied to better support future interaction between human and machine. To infer upcoming context-based behavior, relevant brain states of the human have to be detected. This is achieved by brain reading (BR), a passive approach for single-trial EEG analysis that makes use of supervised machine learning (ML) methods. In this work we propose that BR is able to detect concrete states of the interacting human. To support this, we show that BR detects patterns in the electroencephalogram (EEG) that can be related to event-related activity in the EEG, such as the P300, which indicates concrete states or brain processes such as target recognition. Further, we improve the robustness and applicability of BR in application-oriented scenarios by identifying and combining the most relevant training data for single-trial classification and by applying classifier transfer. We show that training and testing, i.e., application of the classifier, can be carried out on different classes, provided that the samples of both classes lack the relevant pattern. Classifier transfer is important for the usage of BR in application scenarios in which only small amounts of training examples are available. Finally, we demonstrate a dual BR application in an experimental setup that requires behavior similar to that performed during the teleoperation of a robotic arm. Here, target recognition processes and movement preparation processes are detected simultaneously. In summary, our findings contribute to the development of robust and stable predictive HMIs that enable the simultaneous support of different interaction behaviors.
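The classifier-transfer idea can be sketched as follows. This is a toy illustration, not the BR pipeline of the study: the feature vectors are synthetic stand-ins for single-trial EEG features, and the choice of a linear SVM is an assumption. The point it makes is that a classifier trained on one class pair can be applied to a different negative class, as long as both negative classes lack the relevant pattern.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Illustrative single-trial feature vectors.
# Positive class: trials containing the relevant pattern (e.g., P300-like).
# Negative classes: two different conditions that both lack that pattern.
n, d = 200, 40
pos_train = rng.normal(0.5, 1.0, (n, d))   # pattern present, training
pos_apply = rng.normal(0.5, 1.0, (n, d))   # pattern present, application
neg_train = rng.normal(0.0, 1.0, (n, d))   # negative class used for training
neg_apply = rng.normal(0.0, 1.0, (n, d))   # different negative class at application

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
clf.fit(np.vstack([pos_train, neg_train]),
        np.hstack([np.ones(n), np.zeros(n)]))

# Classifier transfer: apply the trained classifier to a class pair that
# differs from the training pair; this works when both negative classes
# miss the relevant pattern.
acc = clf.score(np.vstack([pos_apply, neg_apply]),
                np.hstack([np.ones(n), np.zeros(n)]))
print(f"transfer accuracy on the new class pair: {acc:.2f}")
```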
In neuroscience, large amounts of data are recorded to provide insights into cerebral information processing and function. The successful extraction of the relevant signals becomes increasingly challenging due to the growing complexity of acquisition techniques and of the questions addressed. Here, automated signal processing and machine learning tools can help to process the data, e.g., to separate signal and noise. With the presented software pySPACE (http://pyspace.github.io/pyspace), signal processing algorithms can be compared and applied automatically to time series data, either with the aim of finding a suitable preprocessing, or of training supervised algorithms to classify the data. pySPACE was originally built to process multi-sensor windowed time series data, such as event-related potentials from the electroencephalogram (EEG). The software provides automated data handling, distributed processing, modular build-up of signal processing chains, and tools for visualization and performance evaluation. Included in the software are various algorithms such as temporal and spatial filters, feature generation and selection, classification algorithms, and evaluation schemes. Further, interfaces to other signal processing tools are provided and, since pySPACE is a modular framework, it can be extended with new algorithms according to individual needs. In the presented work, the structural hierarchies are described, and it is illustrated how users and developers can interface with the software and execute it in offline and online modes. Configuration of pySPACE is realized in the YAML format, so that programming skills are not mandatory for usage. The concept of pySPACE is to have one comprehensive tool that can be used to perform complete signal processing and classification tasks. It further allows users to define their own algorithms, or to integrate and use already existing libraries.
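The modular node-chain concept can be sketched with a scikit-learn analogue; this is not pySPACE's own API, and the sampling rate, filter band, and step names are assumptions chosen for illustration. The sketch shows the typical structure of such a chain on windowed multi-sensor time series: temporal filtering, feature generation, normalization, and classification.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import FunctionTransformer, StandardScaler
from sklearn.svm import SVC

FS = 256  # sampling rate in Hz (assumed for this sketch)

def bandpass(windows, low=0.5, high=12.0):
    """Temporal filter step: zero-phase band-pass over each window."""
    sos = butter(4, [low, high], btype="bandpass", fs=FS, output="sos")
    return sosfiltfilt(sos, windows, axis=-1)

def time_domain_features(windows):
    """Feature generation step: flatten (channels x samples) windows."""
    return windows.reshape(len(windows), -1)

# Modular chain analogous in spirit to a pySPACE node chain:
# temporal filter -> feature generation -> normalization -> classifier.
chain = make_pipeline(
    FunctionTransformer(bandpass),
    FunctionTransformer(time_domain_features),
    StandardScaler(),
    SVC(kernel="linear"),
)

# Illustrative windowed multi-sensor data: 100 trials, 8 channels, 1 s windows.
X = np.random.randn(100, 8, FS)
y = np.random.randint(0, 2, 100)
chain.fit(X, y)
print(chain.score(X, y))
```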