Certain differences between the brain networks of healthy and epileptic subjects have been reported even during interictal activity, when no epileptic seizures occur. Here, magnetoencephalography (MEG) data recorded in the resting state are used to discriminate between healthy subjects and patients with either idiopathic generalized epilepsy or frontal focal epilepsy. Signal features extracted from interictal periods without any epileptiform activity are used to train a machine learning algorithm to draw a diagnosis. This is potentially relevant for patients without frequent or easily detectable spikes. To analyze the data, we use an up-to-date machine learning algorithm and explore the benefits of including different features obtained from the MEG data as inputs to the algorithm. We find that the relative power spectral density of the MEG time series is sufficient to distinguish between healthy and epileptic subjects with high prediction accuracy. We also find that a combination of features, such as the phase-locking value and the relative power spectral density, allows us to discriminate between generalized and focal epilepsy when these features are calculated over versions of the signals filtered in certain frequency bands. Machine learning algorithms are currently being applied to the analysis and classification of brain signals. It is, however, less clear how to identify the signal features best suited as inputs to such algorithms. Here, we evaluate the influence of input feature selection in a clinical scenario aimed at distinguishing between healthy and epileptic subjects. Our results indicate that such a distinction is possible with high accuracy (86%), and that the approach also allows discrimination between idiopathic generalized and frontal focal epilepsy.
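A minimal sketch of the kind of feature extraction described above is given below, assuming Welch-based relative band power and a Hilbert-transform phase-locking value as features and a generic scikit-learn classifier; the band definitions, classifier choice, and parameters are illustrative assumptions, not the study's exact pipeline.

```python
# Illustrative sketch only: relative band power and phase-locking value (PLV)
# features from interictal MEG segments, fed to a generic classifier.
# Band limits, classifier, and parameters are assumptions, not the study's pipeline.
import numpy as np
from scipy.signal import welch, hilbert
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def relative_band_power(segment, fs):
    """segment: (n_channels, n_samples) interictal MEG data -> feature vector."""
    freqs, psd = welch(segment, fs=fs, nperseg=int(2 * fs), axis=-1)
    total = np.trapz(psd, freqs, axis=-1)              # total power per channel
    feats = []
    for lo, hi in BANDS.values():
        idx = (freqs >= lo) & (freqs < hi)
        feats.append(np.trapz(psd[:, idx], freqs[idx], axis=-1) / total)
    return np.concatenate(feats)

def plv(band_segment):
    """Phase-locking value for all channel pairs of a band-passed segment."""
    phase = np.angle(hilbert(band_segment, axis=-1))
    n = phase.shape[0]
    return np.array([np.abs(np.mean(np.exp(1j * (phase[i] - phase[j]))))
                     for i in range(n) for j in range(i + 1, n)])

# Hypothetical usage, assuming `segments` (list of MEG arrays) and `labels` exist:
# X = np.stack([relative_band_power(seg, fs=600) for seg in segments])
# scores = cross_val_score(RandomForestClassifier(n_estimators=300), X, labels, cv=5)
```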
Sensorimotor learning refers to improvements that occur through practice in the performance of sensory-guided motor behaviors. Leveraging novel technical capabilities of an immersive virtual environment, we probed the component kinematic processes that mediate sensorimotor learning. Twenty naïve subjects performed a simulated marksmanship task modeled after Olympic Trap Shooting standards. We measured movement kinematics and shooting performance as participants practiced 350 trials while receiving trial-by-trial feedback about shooting success. Spatiotemporal analysis of motion tracking elucidated the ballistic and refinement phases of hand movements. We found systematic changes in movement kinematics that accompanied improvements in shot accuracy during training, though reaction and response times did not change over blocks. In particular, we observed longer, slower, and more precise ballistic movements that replaced effort spent on corrections and refinement. Collectively, these results leverage developments in immersive virtual reality technology to quantify and compare the kinematics of movement during early learning of full-body sensorimotor orienting.
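As an illustration of how a ballistic/refinement split of hand movements can be operationalized, the sketch below segments a tracked hand trajectory at the first speed trough after peak speed; this trough-based criterion and the 3-D position input are assumptions for illustration, not the study's exact spatiotemporal analysis.

```python
# Illustrative sketch: split a hand trajectory into a ballistic phase (launch to
# the first speed trough after peak speed) and a subsequent refinement phase.
# The trough criterion is an assumption, not the study's segmentation rule.
import numpy as np

def segment_phases(position, fs):
    """position: (n_samples, 3) tracked hand position; returns the two phase slices."""
    vel = np.gradient(position, 1.0 / fs, axis=0)     # finite-difference velocity
    speed = np.linalg.norm(vel, axis=1)
    peak = int(np.argmax(speed))
    after = speed[peak:]
    # first local minimum of speed after the peak ends the ballistic phase
    troughs = np.where((after[1:-1] < after[:-2]) & (after[1:-1] < after[2:]))[0]
    split = peak + 1 + int(troughs[0]) if troughs.size else len(speed) - 1
    return slice(0, split), slice(split, len(speed))
```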
Using dry electrodes is desirable because it reduces set-up time; however, this study demonstrates that online performance with dry electrodes is significantly poorer than with wet electrodes, for users both with and without disabilities. We test a new application of dynamic stopping algorithms to compensate for the poorer signal-to-noise ratio (SNR). Dynamic stopping improved the performance of the dry-electrode system; however, further signal processing efforts are likely necessary for full mitigation.
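A minimal sketch of the general idea behind dynamic stopping is shown below: classifier evidence is accumulated over stimulus repetitions and the trial ends early once a confidence threshold is reached. The softmax confidence model, threshold, and repetition cap are illustrative assumptions, not the algorithm evaluated in this study.

```python
# Illustrative sketch of threshold-based dynamic stopping for a BCI selection task.
# The softmax confidence model, threshold, and repetition cap are assumptions.
import numpy as np

def dynamic_stopping(score_stream, candidates, threshold=0.95, max_reps=10):
    """score_stream yields, per repetition, one classifier score per candidate."""
    total = np.zeros(len(candidates))
    rep = 0
    for rep, scores in enumerate(score_stream, start=1):
        total += np.asarray(scores, dtype=float)       # accumulate evidence
        conf = np.exp(total - total.max())
        conf /= conf.sum()                             # softmax over candidates
        if conf.max() >= threshold or rep >= max_reps:
            return candidates[int(conf.argmax())], rep  # stop as soon as confident
    return candidates[int(total.argmax())], rep         # repetitions exhausted
```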
The fusion of immersive virtual reality, kinematic movement tracking, and EEG offers a powerful test bed for naturalistic neuroscience research. Here, we combined these elements to investigate the neuro-behavioral mechanisms underlying precision visual–motor control as 20 participants completed a three-visit, visual–motor, coincidence-anticipation task, modeled after Olympic Trap Shooting and performed in immersive and interactive virtual reality. Analyses of the kinematic metrics demonstrated learning of more efficient movements, with significantly faster hand reaction times, earlier trigger response times, and higher spatial precision, leading to an average 13% improvement in shot scores across the visits. As revealed by spectral and time-locked analyses of the EEG beta band (13–30 Hz), power measured prior to target launch and visual-evoked potential amplitudes measured immediately after target launch correlated with subsequent reactive kinematic performance in the shooting task. Moreover, both launch-locked and shot/feedback-locked visual-evoked potentials became earlier and more negative with practice, pointing to neural mechanisms that may contribute to the development of visual–motor proficiency. Collectively, these findings illustrate EEG and kinematic biomarkers of precision motor control and changes in the neurophysiological substrates that may underlie motor learning.
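The sketch below illustrates one way to compute per-trial pre-launch beta power from launch-locked EEG epochs so that it can be correlated with subsequent kinematic performance; the window length, Welch parameters, and channel averaging are assumptions, not the study's exact spectral pipeline.

```python
# Illustrative sketch: pre-launch beta (13-30 Hz) power per trial from epoched EEG.
# Window length, sampling rate, and Welch parameters are assumptions.
import numpy as np
from scipy.signal import welch

def prelaunch_beta_power(epochs, fs, pre_s=1.0, band=(13.0, 30.0)):
    """epochs: (n_trials, n_channels, n_samples) aligned so that the first
    `pre_s` seconds of each epoch precede the target launch."""
    n_pre = int(pre_s * fs)
    pre = epochs[:, :, :n_pre]                          # window before launch
    freqs, psd = welch(pre, fs=fs, nperseg=min(n_pre, int(fs)), axis=-1)
    idx = (freqs >= band[0]) & (freqs <= band[1])
    beta = np.trapz(psd[..., idx], freqs[idx], axis=-1)  # per trial and channel
    return beta.mean(axis=1)                            # average over channels

# Hypothetical usage, assuming `epochs` and per-trial `reaction_times` exist:
# r = np.corrcoef(prelaunch_beta_power(epochs, fs=500), reaction_times)[0, 1]
```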