It is generally thought that the signal-to-noise ratio, the bandwidth, and the information content of neural data acquired via noninvasive scalp electroencephalography (EEG) are insufficient to extract detailed information about natural, multijoint movements of the upper limb. Here, we challenge this assumption by continuously decoding three-dimensional (3D) hand velocity from neural data acquired from the scalp with 55-channel EEG during a 3D center-out reaching task. To preserve ecological validity, five subjects self-initiated reaches and self-selected targets. Eye movements were controlled so they would not confound the interpretation of the results. With only 34 sensors, the correlation between measured and reconstructed velocity profiles compared reasonably well to that reported by studies that decoded hand kinematics from neural activity acquired intracranially. We subsequently examined the individual contributions of EEG sensors to decoding and found substantial involvement of scalp areas over the sensorimotor cortex contralateral to the reaching hand. Using standardized low-resolution brain electromagnetic tomography (sLORETA), we identified distributed current density sources related to hand velocity in the contralateral precentral gyrus, postcentral gyrus, and inferior parietal lobule. Furthermore, we discovered that movement variability negatively correlated with decoding accuracy, a finding to consider during the development of brain-computer interface systems. Overall, the ability to continuously decode 3D hand velocity from EEG during natural, center-out reaching holds promise for the furtherance of noninvasive neuromotor prostheses for movement-impaired individuals.
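To illustrate the kind of continuous decoding described above, the sketch below reconstructs each velocity component from a window of lagged multichannel EEG amplitudes with a ridge-regularized linear regression, and scores the result with the Pearson correlation between measured and reconstructed velocity profiles. The specific decoder, lag length, regularization, and the synthetic data are illustrative assumptions; the abstract states only that 3D hand velocity was decoded continuously from 34 to 55 sensors.

```python
# Minimal sketch of continuous kinematic decoding from EEG (illustrative only).
# Assumptions not taken from the abstract: a lagged linear (ridge) decoder,
# 10 past samples per channel, and synthetic data standing in for real recordings.
import numpy as np

def build_lagged_design(eeg, n_lags):
    """Stack the current and n_lags past samples of every channel into one row per time point."""
    n_samples, n_channels = eeg.shape
    rows = []
    for t in range(n_lags, n_samples):
        rows.append(eeg[t - n_lags:t + 1].ravel())  # length (n_lags + 1) * n_channels
    return np.asarray(rows)

def ridge_fit(X, Y, alpha=1.0):
    """Closed-form ridge regression: W = (X'X + alpha*I)^-1 X'Y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ Y)

rng = np.random.default_rng(0)
n_samples, n_channels, n_lags = 5000, 34, 10        # 34 sensors, as in the abstract
eeg = rng.standard_normal((n_samples, n_channels))  # placeholder for preprocessed EEG
true_w = rng.standard_normal((n_channels, 3))
velocity = eeg @ true_w + 0.5 * rng.standard_normal((n_samples, 3))  # synthetic 3D hand velocity

X = build_lagged_design(eeg, n_lags)
Y = velocity[n_lags:]
split = int(0.8 * len(X))                  # simple train/test split in time
W = ridge_fit(X[:split], Y[:split], alpha=10.0)
Y_hat = X[split:] @ W

# Pearson correlation between measured and reconstructed velocity, per axis
for axis, name in enumerate("xyz"):
    r = np.corrcoef(Y[split:, axis], Y_hat[:, axis])[0, 1]
    print(f"velocity {name}: r = {r:.2f}")
```

In the same spirit, the per-sensor contribution analysis mentioned in the abstract could be approximated by inspecting the magnitude of the fitted weights associated with each channel, though the study's actual sensitivity measure is not specified here.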
Although there is converging experimental and clinical evidence suggesting that mental training with motor imagery can improve motor performance, it is unclear how humans can learn movements through mental training despite the lack of sensory feedback from the body and the environment. In a first experiment, we measured the trial-by-trial decrease in durations of executed movements (physical training group) and mentally simulated movements (motor-imagery training group) during training on a multiple-target arm-pointing task requiring high accuracy and speed. Movement durations were significantly lower in the posttest than in the pretest after both physical and motor-imagery training. Although both the posttraining performance and the rate of learning were smaller in the motor-imagery training group than in the physical training group, the change in movement duration and the asymptotic movement duration after a hypothetical large number of trials were identical. The two control groups (eye-movement training and rest groups) showed no change in movement duration. In the second experiment, additional kinematic analyses revealed that arm movements were straighter and faster both immediately and 24 h after physical and motor-imagery training. No such improvements were observed in the eye-movement training group. Our results suggest that the brain uses state estimation, provided by internal forward model predictions, to improve motor performance during mental training. Furthermore, our results suggest that mental practice can, at least in young healthy subjects and if given after a short bout of physical practice, be successfully substituted for physical practice to improve motor performance.
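The comparison of learning rates and asymptotic movement durations implies fitting a learning curve to the trial-by-trial durations and extrapolating it to a hypothetical large number of trials. The sketch below shows one common way to do this, fitting an exponential decay model with scipy.optimize.curve_fit; the exponential form, the parameter names, and the synthetic data are assumptions, since the abstract does not specify the curve-fitting procedure used.

```python
# Illustrative sketch of estimating a learning rate and an asymptotic movement duration
# from trial-by-trial durations, using an exponential learning-curve model (an assumption).
import numpy as np
from scipy.optimize import curve_fit

def learning_curve(trial, asymptote, amplitude, rate):
    """Duration on a given trial: asymptote + amplitude * exp(-rate * trial)."""
    return asymptote + amplitude * np.exp(-rate * trial)

rng = np.random.default_rng(1)
trials = np.arange(50)
# Synthetic durations (seconds) decreasing from ~2.0 s toward ~1.2 s with noise
durations = learning_curve(trials, 1.2, 0.8, 0.1) + 0.05 * rng.standard_normal(trials.size)

params, _ = curve_fit(learning_curve, trials, durations, p0=(1.0, 1.0, 0.1))
asymptote, amplitude, rate = params
print(f"asymptotic duration after many trials: {asymptote:.2f} s")
print(f"total change in duration: {amplitude:.2f} s, learning rate: {rate:.3f} per trial")
```

Fitting the same model separately to the physical and motor-imagery groups would let the asymptotes and rates be compared in the way the abstract describes.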