Brain-machine interfaces (BMIs)1,2 use neuronal activity recorded from the brain to establish direct communication with external actuators, such as prosthetic arms. While BMIs aim to restore the normal sensorimotor functions of the limbs, so far they have lacked tactile sensation. Here we demonstrate the operation of a brain-machine-brain interface (BMBI) that both controls the exploratory reaching movements of an actuator and enables the signalling of artificial tactile feedback through intracortical microstimulation (ICMS) of the primary somatosensory cortex (S1). Monkeys performed an active-exploration task in which an actuator (a computer cursor or a virtual-reality hand) was moved using a BMBI that derived motor commands from neuronal ensemble activity recorded in primary motor cortex (M1). ICMS feedback occurred whenever the actuator touched virtual objects. Temporal patterns of ICMS encoded the artificial tactile properties of each object. Neuronal recordings and ICMS epochs were temporally multiplexed to avoid interference. Two monkeys operated this BMBI to search for and discriminate one of three visually indistinguishable objects, using the virtual hand to identify the unique artificial texture (AT) associated with each. These results suggest that clinical motor neuroprostheses might benefit from the addition of ICMS feedback to generate artificial somatic perceptions associated with mechanical, robotic, or even virtual prostheses.
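The interleaving of recording and stimulation epochs described above can be made concrete with a small scheduling sketch. This is a minimal illustration under assumed parameters, not the paper's implementation: the 100 ms cycle, the 50/50 recording/stimulation split, and the texture-specific inter-pulse intervals are all hypothetical values chosen for the example.

```python
import numpy as np

# Illustrative parameters (assumed, not taken from the paper): 100 ms cycles split
# into a 50 ms recording half and a 50 ms ICMS half, and artificial textures
# distinguished only by the periodicity of their pulse trains.
CYCLE_MS = 100
RECORD_MS = 50          # neuronal data are accepted only in this half-cycle
TEXTURES = {
    "AT_high": 10,      # inter-pulse interval (ms) while touching texture 1
    "AT_low": 30,       # inter-pulse interval (ms) while touching texture 2
    "null": None,       # the third object delivers no ICMS
}

def icms_pulse_times(texture, n_cycles):
    """Return ICMS pulse times (ms), confined to stimulation half-cycles."""
    ipi = TEXTURES[texture]
    if ipi is None:
        return np.array([])
    pulses = []
    for c in range(n_cycles):
        start = c * CYCLE_MS + RECORD_MS          # stimulation window begins
        pulses.extend(np.arange(start, start + RECORD_MS, ipi))
    return np.array(pulses)

def in_recording_window(t_ms):
    """True if an event at time t_ms falls in a recording (artifact-free) window."""
    return (t_ms % CYCLE_MS) < RECORD_MS

# Example: 0.5 s of contact with the high-frequency artificial texture.
pulses = icms_pulse_times("AT_high", n_cycles=5)
assert not any(in_recording_window(t) for t in pulses)  # no overlap with recording
```

The point of the sketch is simply that stimulation pulses never fall inside recording windows, so decoding can proceed on artifact-free data while texture identity is still carried by the pulse-train timing.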
Brain-machine interfaces (BMIs) provide a new assistive strategy aimed at restoring mobility in severely paralyzed patients. Yet, no study in animals or in human subjects has indicated that long-term BMI training could induce any type of clinical recovery. Eight chronic (3–13 years) spinal cord injury (SCI) paraplegics were subjected to long-term training (12 months) with a multi-stage BMI-based gait neurorehabilitation paradigm aimed at restoring locomotion. This paradigm combined intense immersive virtual reality training, enriched visual-tactile feedback, and walking with two EEG-controlled robotic actuators, including a custom-designed lower limb exoskeleton capable of delivering tactile feedback to subjects. Following 12 months of training with this paradigm, all eight patients experienced neurological improvements in somatic sensation (pain localization, fine/crude touch, and proprioceptive sensing) in multiple dermatomes. Patients also regained voluntary motor control in key muscles below the SCI level, as measured by EMGs, resulting in marked improvement in their walking index. As a result, 50% of these patients were upgraded to an incomplete paraplegia classification. Neurological recovery was paralleled by the reemergence of lower limb motor imagery at cortical level. We hypothesize that this unprecedented neurological recovery results from both cortical and spinal cord plasticity triggered by long-term BMI usage.
Brain-machine interfaces (BMIs) are artificial systems that aim to restore sensation and movement to severely paralyzed patients. However, previous BMIs enabled only single arm functionality, and control of bimanual movements was a major challenge. Here, we developed and tested a bimanual BMI that enabled rhesus monkeys to control two avatar arms simultaneously. The bimanual BMI was based on the extracellular activity of 374–497 neurons recorded from several frontal and parietal cortical areas of both cerebral hemispheres. Cortical activity was transformed into movements of the two arms with a decoding algorithm called a 5th order unscented Kalman filter (UKF). The UKF is well-suited for BMI decoding because it accounts for both characteristics of reaching movements and their representation by cortical neurons. The UKF was trained either during a manual task performed with two joysticks or by having the monkeys passively observe the movements of avatar arms. Most cortical neurons changed their modulation patterns when both arms were engaged simultaneously. Representing the two arms jointly in a single UKF decoder resulted in improved decoding performance compared with using separate decoders for each arm. As the animals’ performance in bimanual BMI control improved over time, we observed widespread plasticity in frontal and parietal cortical areas. Neuronal representation of the avatar and reach targets was enhanced with learning, whereas pairwise correlations between neurons initially increased and then decreased. These results suggest that cortical networks may assimilate the two avatar arms through BMI control.
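As a rough illustration of the decoding step described above, the sketch below runs a standard unscented Kalman filter (via the filterpy library) over binned spike counts, with a quadratic tuning curve as the observation model (the nonlinearity that motivates using a UKF rather than a linear Kalman filter). The state layout, tuning coefficients, bin width, and noise settings are assumptions; the paper's 5th-order variant additionally carries a history of past kinematic states, which is omitted here.

```python
import numpy as np
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

# Illustrative sketch of UKF-based decoding of bimanual kinematics from binned
# spike counts. All numbers below are assumptions for the example.
N_NEURONS = 400                  # order of magnitude quoted in the abstract
DIM_X = 8                        # x, y position and velocity for each of two arms
DT = 0.1                         # 100 ms bins (assumed)

rng = np.random.default_rng(0)
H_lin = rng.normal(scale=0.5, size=(N_NEURONS, DIM_X))   # hypothetical linear tuning
H_quad = rng.normal(scale=0.1, size=(N_NEURONS, DIM_X))  # hypothetical quadratic term
baseline = rng.uniform(2.0, 10.0, size=N_NEURONS)        # baseline rates (spikes/bin)

def fx(x, dt):
    """State transition: positions integrate their velocities; velocities persist."""
    F = np.eye(DIM_X)
    F[0, 2] = F[1, 3] = F[4, 6] = F[5, 7] = dt
    return F @ x

def hx(x):
    """Observation model: firing as a quadratic function of the kinematic state."""
    return baseline + H_lin @ x + H_quad @ (x ** 2)

points = MerweScaledSigmaPoints(n=DIM_X, alpha=1e-3, beta=2.0, kappa=0.0)
ukf = UnscentedKalmanFilter(dim_x=DIM_X, dim_z=N_NEURONS, dt=DT,
                            fx=fx, hx=hx, points=points)
ukf.Q *= 0.01                    # process noise (assumed)
ukf.R = np.eye(N_NEURONS) * 4.0  # observation noise (assumed)

def decode(spike_counts):
    """Decode an array of shape (n_bins, N_NEURONS) into joint two-arm states."""
    states = []
    for z in spike_counts:
        ukf.predict()
        ukf.update(z)
        states.append(ukf.x.copy())
    return np.array(states)
```

Representing both arms in a single state vector, as done here, is what allows one decoder to exploit neurons whose modulation depends on the joint configuration of the two arms, which is the comparison the abstract reports as favouring a unified decoder over two separate ones.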
The brain representation of the body, called the body schema, is susceptible to plasticity. For instance, subjects experiencing a rubber hand illusion develop a sense of ownership of a mannequin hand when they view it being touched while tactile stimuli are simultaneously applied to their own hand. Here, the cortical basis of such an embodiment was investigated through concurrent recordings from primary somatosensory (i.e., S1) and motor (i.e., M1) cortical neuronal ensembles while two monkeys observed an avatar arm being touched by a virtual ball. Following a period when virtual touches occurred synchronously with physical brushes of the monkeys' arms, neurons in S1 and M1 started to respond to virtual touches applied alone. Responses to virtual touch occurred 50 to 70 ms later than to physical touch, consistent with the involvement of polysynaptic pathways linking the visual cortex to S1 and M1. We propose that S1 and M1 contribute to the rubber hand illusion and that, by taking advantage of plasticity in these areas, patients may assimilate neuroprosthetic limbs as parts of their body schema.

multielectrode recordings | cortical plasticity

In the early 1900s, Head and Holmes coined the concept of the "body schema" to describe the spatial model of the body that the brain builds based on sensory inputs from the skin, joints, and muscles, as well as visual and auditory signals (1). Numerous studies since then have explored different aspects of the body schema (2-6), particularly the role of cortical areas (7, 8). The accumulated literature indicates that the body schema is plastic and can even incorporate artificial tools (5, 9, 10). A striking example of body schema plasticity is provided by the rubber hand illusion (RHI), in which subjects start to perceive a mannequin hand as their own after their real hand, hidden from sight, and the mannequin hand are repeatedly touched simultaneously (11-13). Subjects do not perceive a third limb, but report a shift in position sense from the real arm to the fake one (11-14), and there is even a decrease in skin temperature of the real arm (15). Incorporation of artificial limbs into the body schema began to be further explored with the advancement of brain-machine interfaces (BMIs), hybrid systems that connect the brain with external devices (16-19). Here, we recorded cortical ensemble activity in monkeys exposed to the paradigm that elicits the RHI in humans (11-14, 20).

Results

Monkeys M and N were chronically implanted with microwire arrays in the primary motor (M1) and somatosensory (S1) cortices, from which neuronal ensemble activity was recorded. They observed a 3D image of a virtual arm (i.e., an avatar arm) being touched by a virtual ball on an LCD screen while a robot slid a physical brush across the skin of their real arms (Fig. 1 A and B). The virtual touch (V) and physical touch (P) were synchronous or asynchronous (Fig. 1C). In a subset of trials, virtual brushing occurred alone (i.e., V only).

Excitatory and Inhibitory Responses to Physical Touch. Experiments with mon...
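The reported 50 to 70 ms lag of virtual-touch responses relative to physical-touch responses is the kind of quantity typically estimated from peri-event time histograms. The sketch below shows one standard way to do this; it is an assumed analysis for illustration, not necessarily the paper's exact method, and the bin width, analysis window, and threshold criterion are all hypothetical choices.

```python
import numpy as np

# Assumed PSTH-based latency estimation: align spikes to touch onsets, build a
# peri-event time histogram, and take the first post-onset bin that exceeds
# baseline mean + 3 SD as the response latency.
BIN_MS = 5
WINDOW_MS = (-200, 400)          # analysis window around each touch onset (assumed)

def psth(spike_times_ms, event_times_ms):
    """Firing rate (spikes/s) in BIN_MS bins around the event times."""
    edges = np.arange(WINDOW_MS[0], WINDOW_MS[1] + BIN_MS, BIN_MS)
    counts = np.zeros(len(edges) - 1)
    for ev in event_times_ms:
        counts += np.histogram(spike_times_ms - ev, bins=edges)[0]
    rate = counts / (len(event_times_ms) * BIN_MS / 1000.0)
    centers = edges[:-1] + BIN_MS / 2.0
    return centers, rate

def response_latency(centers, rate, n_sd=3.0):
    """First post-onset bin exceeding baseline mean + n_sd * SD, or None."""
    baseline = rate[centers < 0]
    thresh = baseline.mean() + n_sd * baseline.std()
    post = np.where((centers >= 0) & (rate > thresh))[0]
    return centers[post[0]] if post.size else None

# Comparing physical vs. virtual touch then reduces to computing the two PSTHs,
# estimating a latency for each, and taking the difference.
```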