2021
DOI: 10.1523/eneuro.0007-21.2021
Behavioral and Neural Variability of Naturalistic Arm Movements

Acknowledgments (excerpt): …for discussions and help with data collection and analysis; neurosurgeons Jeffrey G. Ojemann and Andrew Ko, as well as the excellent staff at the Harborview Hospital Neurosurgery department, for their care of the patients during their monitoring.

Cited by 8 publications (16 citation statements) · References 103 publications (98 reference statements)
“…We assessed cross-modal, self-supervised decoding performance on four datasets (table 1) and demonstrate in each case that cross-modal decoding outperforms unimodal, self-supervised models and approaches the accuracy of supervised models. We consider three movement decoding tasks: determining whether a participant's arm was moving or at rest (ECoG move/rest [56,77] and EEG move/rest [73]), predicting which of five fingers was being flexed (ECoG finger flexion [40,74]), and determining whether a participant was exposed to a visual or physical balance perturbation while either walking or standing (EEG balance perturbations [65]). Our decoding models all use the HTNet architecture, a compact convolutional neural network that has been demonstrated to perform well at decoding ECoG/EEG data [25,78].…”
Section: Results (mentioning, confidence: 99%)
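
The quoted results passage centers on HTNet, described as a compact convolutional network used for all of the ECoG/EEG decoding tasks. The sketch below is a deliberately simplified decoder in the same spirit (a temporal convolution followed by a depthwise spatial convolution over electrodes), written in PyTorch as an assumption; it is not the published HTNet code, and the input shape, kernel sizes, and channel counts are illustrative rather than values taken from the cited datasets.

```python
# Minimal sketch (NOT the published HTNet implementation) of a compact
# convolutional decoder for binary move/rest classification from ECoG/EEG.
# All shapes (n_channels=64, n_samples=500, i.e. ~2 s at an assumed 250 Hz)
# are illustrative assumptions.
import torch
import torch.nn as nn

class CompactDecoder(nn.Module):
    def __init__(self, n_channels=64, n_samples=500, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            # Temporal convolution: learn frequency-selective filters per channel.
            nn.Conv2d(1, 8, kernel_size=(1, 65), padding=(0, 32), bias=False),
            nn.BatchNorm2d(8),
            # Depthwise spatial convolution: mix information across electrodes.
            nn.Conv2d(8, 16, kernel_size=(n_channels, 1), groups=8, bias=False),
            nn.BatchNorm2d(16),
            nn.ELU(),
            nn.AvgPool2d((1, 4)),
            nn.Dropout(0.5),
        )
        with torch.no_grad():
            n_feat = self.features(torch.zeros(1, 1, n_channels, n_samples)).numel()
        self.classifier = nn.Linear(n_feat, n_classes)

    def forward(self, x):  # x: (batch, 1, n_channels, n_samples)
        return self.classifier(self.features(x).flatten(1))

# Example: one batch of 32 simulated trials.
logits = CompactDecoder()(torch.randn(32, 1, 64, 500))
print(logits.shape)  # torch.Size([32, 2])
```

In the actual HTNet architecture the temporal filtering stage is followed by a Hilbert-transform-based envelope computation; the sketch above omits that step and keeps only the generic compact-CNN structure.
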
“…Data streams include electrocorticography (ECoG), electroencephalography (EEG), electromyography (EMG), and multiple kinematic measurements. Kinematic measurements were obtained from markerless motion capture applied to video recordings (ECoG move/rest [56]), exoskeleton positions (EEG move/rest [73]), dataglove recordings (ECoG finger flexion [74]), and motion capture markers (EEG balance perturbations [65]). We computed the number of events per participant after balancing events across classes.…”
Section: Results (mentioning, confidence: 99%)
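
The passage above notes that events were counted per participant after balancing events across classes. A common way to do this is random undersampling to the minority class; the snippet below is a generic sketch of that idea only (the function name and exact procedure are assumptions, not taken from the cited work).

```python
# Minimal sketch of per-participant class balancing by random undersampling:
# every class is downsampled to the size of that participant's smallest class.
# Illustrative reconstruction, not the cited papers' exact procedure.
import numpy as np

def balance_events(labels, rng=None):
    """Return sorted indices that keep an equal number of events per class."""
    rng = np.random.default_rng(rng)
    labels = np.asarray(labels)
    classes, counts = np.unique(labels, return_counts=True)
    n_keep = counts.min()
    keep = [rng.choice(np.flatnonzero(labels == c), size=n_keep, replace=False)
            for c in classes]
    return np.sort(np.concatenate(keep))

# Example: 120 "move" vs 80 "rest" events -> 80 of each retained.
labels = np.array(["move"] * 120 + ["rest"] * 80)
idx = balance_events(labels, rng=0)
print(len(idx), np.unique(labels[idx], return_counts=True))
```
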
“…This analysis also excluded the first 5 s of each short clip within each video block, so that it is aligned with the ISC analysis. In general, employing a machine learning algorithm to track movements of human limbs has been found to be promising in studies of naturalistic human movements ( Peterson et al, 2021 ). For the automatic annotation of movements, we implemented the OpenPose github repository ( https://github.com/CMU-Perceptual-Computing-Lab/openpose ), as in Ntoumanis et al (2022) .…”
Section: Methods (mentioning, confidence: 99%)
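
The passage above describes annotating limb movement automatically from OpenPose keypoints while discarding the first 5 s of each short clip. The sketch below shows one generic way such post-processing could look; the frame rate, array layout, keypoint index, and speed-based movement score are all assumptions, and the code does not call the OpenPose repository itself.

```python
# Minimal sketch of post-processing pose-estimation output for movement
# annotation: drop the first 5 s of each clip, then score movement as the
# frame-to-frame displacement of one keypoint (e.g. a wrist). The array layout
# (frames x keypoints x 2, plus clip start frames) is an assumed format; it is
# not the OpenPose JSON schema or the cited study's exact pipeline.
import numpy as np

FPS = 30      # assumed video frame rate
SKIP_S = 5    # seconds to exclude at the start of every clip

def movement_trace(keypoints, clip_starts, keypoint_idx, fps=FPS, skip_s=SKIP_S):
    """keypoints: (n_frames, n_keypoints, 2) pixel coordinates."""
    xy = keypoints[:, keypoint_idx, :]
    speed = np.r_[0.0, np.linalg.norm(np.diff(xy, axis=0), axis=1)]  # px/frame
    keep = np.ones(len(xy), dtype=bool)
    for start in clip_starts:                 # mask the first skip_s of each clip
        keep[start:start + int(skip_s * fps)] = False
    return speed, keep

# Example with simulated keypoints for two 20 s clips.
kp = np.random.rand(2 * 20 * FPS, 25, 2) * 100
speed, keep = movement_trace(kp, clip_starts=[0, 20 * FPS], keypoint_idx=4)
print(speed[keep].mean())
```
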
“…Highly invasive solutions such as intracranial EEG also may not be acceptable or justifiable for many participants. 16,46–48 Subgaleal EEG (sgEEG) allows for an ultra-long-term EEG recording with high stability and data quality. There may be applications beyond epilepsy, such as brain-computer interfaces or gaming.…”
Section: Perspective and Future Work (mentioning, confidence: 99%)