Options for people with severe paralysis who have lost the ability to communicate orally are limited. We describe a method for communication in a patient with late-stage amyotrophic lateral sclerosis (ALS), involving a fully implanted brain-computer interface that consists of subdural electrodes placed over the motor cortex and a transmitter placed subcutaneously in the left side of the thorax. By attempting to move the hand on the side opposite the implanted electrodes, the patient accurately and independently controlled a computer typing program 28 weeks after electrode placement, at the equivalent of two letters per minute. The brain-computer interface offered autonomous communication that supplemented and at times supplanted the patient's eye-tracking device. (Funded by the Government of the Netherlands and the European Union; ClinicalTrials.gov number, NCT02224469.)
Electrocorticography (ECoG)-based brain-computer interfaces (BCIs) have been proposed as a way to restore and replace motor function or communication in severely paralyzed people. To date, most motor-based BCIs have focused either on the sensorimotor cortex as a whole or on the primary motor cortex (M1) as a source of signals for this purpose. Still, target areas for BCI are not confined to M1, and other brain regions may provide suitable BCI control signals. A logical candidate is the primary somatosensory cortex (S1), which not only shares a somatotopic organization similar to that of M1, but has also been suggested to have a role beyond sensory feedback during movement execution. Here, we investigated whether four complex hand gestures, taken from the American Sign Language alphabet, can be decoded exclusively from S1 using both spatial and temporal information. For decoding, we used the signal recorded from a small patch of cortex with subdural high-density (HD) grids in five patients with intractable epilepsy. Notably, we introduce a new method of trial alignment based on the increase of the electrophysiological response, which virtually eliminates the confounding effects of systematic and non-systematic temporal differences within and between gesture executions. Results show that S1 classification scores are high (76%), similar to those obtained from M1 (74%) and the sensorimotor cortex as a whole (85%), and significantly above chance level (25%). We conclude that S1 offers characteristic spatiotemporal neuronal activation patterns that are discriminative between gestures, and that it is possible to decode gestures with high accuracy from a very small patch of cortex using subdurally implanted HD grids. The feasibility of decoding hand gestures using HD-ECoG grids encourages further investigation of implantable BCI systems for direct interaction between the brain and external devices with multiple degrees of freedom.
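The trial-alignment idea described above — re-referencing each trial to the rise of its own electrophysiological response rather than to an external cue — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the onset criterion (a fraction of each trial's peak band power), and the window length are all illustrative assumptions.

```python
import numpy as np

def align_trials(power, frac=0.5, win=100):
    """Align trials on the rise of the band-power response.

    power : (n_trials, n_samples) array of band-power envelopes.
    Each trial is re-referenced to the first sample where its power
    exceeds `frac` of that trial's peak; a fixed window of `win`
    samples starting at that onset is then extracted, so latency
    differences between trials no longer smear the average pattern.
    """
    aligned = []
    for trial in power:
        onset = int(np.argmax(trial >= frac * trial.max()))
        seg = trial[onset:onset + win]
        # pad short segments at the end so all trials match in length
        seg = np.pad(seg, (0, win - len(seg)))
        aligned.append(seg)
    return np.vstack(aligned)
```

After alignment, trials with different response latencies share a common time axis, which is what makes the temporal component of the spatiotemporal patterns usable for classification.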
The mechanism(s) by which anesthetics reversibly suppress consciousness are incompletely understood. Previous functional imaging studies demonstrated dynamic changes in thalamic and cortical metabolic activity, as well as the maintained presence of metabolically defined functional networks despite the loss of consciousness. However, the invasive electrophysiology associated with these observations has yet to be studied. By recording electrical activity directly from the cortical surface, electrocorticography (ECoG) provides a powerful method to integrate spatial, temporal, and spectral features of cortical electrophysiology not possible with noninvasive approaches. In this study, we report a unique comprehensive recording of invasive human cortical physiology during both induction and emergence from propofol anesthesia. Propofol-induced transitions in and out of consciousness (defined here as responsiveness) were characterized by maintained large-scale functional networks defined by correlated fluctuations of the slow cortical potential (<0.5 Hz) over the somatomotor cortex, present even in the deeply anesthetized state of burst suppression. Similarly, phase-power coupling between θ- and γ-range frequencies persisted throughout the induction and emergence from anesthesia. Superimposed on this preserved functional architecture were alterations in frequency band power, variance, covariance, and phase-power interactions that were distinct to different frequency ranges and occurred in separable phases. These data support that dynamic alterations in cortical and thalamocortical circuit activity occur in the context of a larger stable architecture that is maintained despite anesthetic-induced alterations in consciousness.

Keywords: cortical networks | human cortex | gamma rhythms

Every year millions of people undergo general anesthesia, yet the mechanism(s) by which widely used clinical anesthetics reversibly ablate consciousness remains incompletely understood (1).
Moreover, the manner in which the brain is able to tolerate global pharmacologic suppression, yet still maintain memories and resume complex cortical interactions that define a person's cognition after removal of this suppression, also remains unknown. Thus far, the majority of studies in humans have used noninvasive methods such as functional imaging and electroencephalography (EEG) to arrive at the current understanding. To date, positron emission tomography (PET) and functional magnetic resonance imaging (fMRI) studies show that there is a complex interplay between and within the thalamus and the cortex. These studies demonstrate that the thalamus is a common site of deactivation during induction by various anesthetic agents (2, 3), that there appears to be a disruption of thalamo-cortical and cortico-cortical connectivity (4, 5), and that specific regions of association cortices show enhanced deactivation with certain anesthetics (6, 7). In parallel with these dynamic interactions, there also appear to be physiologic elements that are invariant and do not change wi...
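The θ–γ phase-power coupling reported in the propofol study above is commonly quantified with a mean-vector-length style metric: γ-band amplitude is weighted by the instantaneous θ phase and averaged. A minimal sketch, assuming Hilbert-transform estimates of phase and amplitude; the band edges, filter order, and function names here are illustrative choices, not the study's exact analysis pipeline.

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

def band(x, lo, hi, fs):
    """Zero-phase bandpass filter (4th-order Butterworth, filtfilt)."""
    b, a = butter(4, [lo, hi], btype="bandpass", fs=fs)
    return filtfilt(b, a, x)

def phase_power_coupling(x, fs, theta=(4, 8), gamma=(40, 80)):
    """Mean-vector-length estimate of theta-phase / gamma-amplitude coupling.

    Returns |mean(A_gamma * exp(i * phi_theta))| normalized by the mean
    gamma amplitude: ~0 when gamma power is unrelated to theta phase,
    larger when gamma power concentrates at a preferred theta phase.
    """
    phi = np.angle(hilbert(band(x, *theta, fs)))
    amp = np.abs(hilbert(band(x, *gamma, fs)))
    return np.abs(np.mean(amp * np.exp(1j * phi))) / amp.mean()
```

On a synthetic signal whose 60 Hz amplitude rides on a 6 Hz oscillation, this metric is clearly nonzero, and it falls toward zero when the modulation is removed.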
The increasing understanding of human brain functions makes it possible to directly interact with the brain for therapeutic purposes. Implantable brain-computer interfaces promise to replace or restore motor functions in patients with partial or complete paralysis. We postulate that neuronal states associated with gestures, as they are used in the finger-spelling alphabet of sign languages, provide an excellent signal for implantable brain-computer interfaces to restore communication. To test this, we evaluated the decodability of four gestures using high-density electrocorticography in two participants. The electrode grids were located subdurally on the hand knob area of the sensorimotor cortex, covering a surface of 2.5–5.2 cm². Using a pattern-matching classification approach, four types of hand gestures were classified based on their pattern of neuronal activity. In the two participants, the gestures were classified with 97% and 74% accuracy. The high frequencies (>65 Hz) allowed for the best classification results. This proof-of-principle study indicates that the four gestures are associated with a reliable and discriminable spatial representation on a confined area of the sensorimotor cortex. This robust representation on a small area makes hand gestures an interesting control feature for an implantable BCI to restore communication for severely paralyzed people.
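A pattern-matching classification approach of the kind described above can be reduced to template correlation: average the high-frequency activity pattern across electrodes for each gesture's training trials, then assign a test trial to the class whose template it correlates with best. This is a generic sketch of that idea, not the study's code; the function names and the synthetic feature layout are assumptions.

```python
import numpy as np

def fit_templates(X, y):
    """Mean spatial activity pattern per gesture class.

    X : (n_trials, n_electrodes) high-frequency band power per trial.
    y : (n_trials,) integer gesture labels.
    Returns the class labels and one mean template per class.
    """
    classes = np.unique(y)
    return classes, np.vstack([X[y == c].mean(axis=0) for c in classes])

def classify(x, classes, templates):
    """Assign the class whose template correlates best with trial x."""
    r = [np.corrcoef(x, t)[0, 1] for t in templates]
    return classes[int(np.argmax(r))]
```

With distinct per-gesture electrode patterns and modest noise, this nearest-template rule recovers the labels, which is essentially what a "reliable and discriminable spatial representation" buys the classifier.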
For people who cannot communicate due to severe paralysis or involuntary movements, technology that decodes intended speech from the brain may offer an alternative means of communication. If decoding proves to be feasible, intracranial brain-computer interface systems can be developed that translate decoded speech into computer-generated speech or into instructions for controlling assistive devices. Recent advances suggest that such decoding may be feasible from the sensorimotor cortex, but it is not clear how this challenge can best be approached. One approach is to identify and discriminate elements of spoken language, such as phonemes. We investigated the feasibility of decoding four spoken phonemes from the sensorimotor face area, using electrocorticographic signals obtained with high-density electrode grids. Several decoding algorithms, including spatiotemporal matched filters, spatial matched filters, and support vector machines, were compared. Phonemes could be classified correctly at a level of over 75% with spatiotemporal matched filters. Support vector machine analysis reached a similar level, but spatial matched filters yielded significantly lower scores. The most informative electrodes were clustered along the central sulcus. The highest scores were achieved from time windows centered around voice onset time, but a 500 ms window before onset time could also be classified significantly. The results suggest that phoneme production involves a sequence of robust and reproducible activity patterns on the cortical surface. Importantly, decoding requires the inclusion of temporal information to capture the rapid shifts of robust patterns associated with articulator muscle group contraction during production of a phoneme. The high classification scores are likely enabled by the use of high-density grids and by the use of discrete phonemes. Implications for use in brain-computer interfaces are discussed.
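The distinction drawn above — spatiotemporal matched filters outperforming purely spatial ones — comes down to flattening each trial's full (electrodes × time) pattern into one vector before template matching, so the order in which electrodes activate is preserved. A minimal sketch under that reading; the function names and toy data are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def fit_st_filters(X, y):
    """Spatiotemporal matched filters: one mean (electrodes*time) vector
    per phoneme class, computed from flattened training trials.

    X : (n_trials, n_electrodes, n_timebins) activity per trial.
    y : (n_trials,) integer phoneme labels.
    """
    classes = np.unique(y)
    flat = X.reshape(len(X), -1)
    return classes, np.vstack([flat[y == c].mean(axis=0) for c in classes])

def st_classify(trial, classes, filters):
    """Correlate a flattened trial against each class filter."""
    v = trial.ravel()
    r = [np.corrcoef(v, f)[0, 1] for f in filters]
    return classes[int(np.argmax(r))]
```

The toy test below makes the point concretely: two classes engage the same electrodes with the same average power but in opposite temporal order, so any time-averaged spatial template is identical for both, while the spatiotemporal filter separates them.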