2023
DOI: 10.1109/tnsre.2022.3220884

Decoding Coordinated Directions of Bimanual Movements From EEG Signals

Abstract: Bimanual coordination is common in human daily life, whereas current research has focused mainly on decoding unimanual movement from electroencephalogram (EEG) signals. Here we developed a brain-computer interface (BCI) paradigm of task-oriented bimanual movements to decode coordinated directions from movement-related cortical potentials (MRCPs) of EEG. Eight healthy subjects participated in the target-reaching task, including (1) performing leftward, midward, and rightward bimanual movements, and (2) performing l…

Cited by 11 publications (11 citation statements)
References 48 publications
“…[23] Ofner et al. [22] conducted a study in which they observed that the averaged signal of the central electrode Cz exhibited the characteristic MRCP pattern during movement attempts. Moreover, they identified discriminative information, such as positive and negative peaks, within MRCPs that could be indicative of the movement class, a finding also supported by the research of Zhang et al. [24]. This observation suggests that MRCP patterns contain valuable information that can be used to preprocess signals before feeding them into the classifier model, ultimately enhancing its performance. Therefore, in our proposed methodology, we partitioned the entire dataset into a training set, consisting of 80% of the data, and a test set, comprising the remaining 20%.…”
Section: MRCP (supporting)
Confidence: 58%
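The statement above describes averaging the Cz channel to reveal the MRCP waveform and partitioning the data 80%/20% for training and testing. The following is a minimal, hypothetical sketch of those two steps; the array shapes, the Cz channel index, and the stratified split are assumptions for illustration and do not reproduce the cited studies' pipelines.

```python
# Hypothetical sketch: grand-average MRCP at Cz and an 80/20 train/test split.
# Assumes epochs of shape (n_trials, n_channels, n_samples) already extracted
# around movement onset; all dimensions and the Cz index are placeholders.
import numpy as np
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 120, 61, 500    # placeholder dimensions
epochs = rng.standard_normal((n_trials, n_channels, n_samples))
labels = rng.integers(0, 3, size=n_trials)        # e.g., left / mid / right

cz_idx = 30                                       # assumed index of electrode Cz
mrcp_cz = epochs[:, cz_idx, :].mean(axis=0)       # grand-average MRCP waveform at Cz

neg_peak = mrcp_cz.min()                          # negative peak amplitude
pos_peak = mrcp_cz.max()                          # positive (rebound) peak amplitude

# 80% training / 20% test split, stratified by movement class.
X_train, X_test, y_train, y_test = train_test_split(
    epochs, labels, test_size=0.2, stratify=labels, random_state=0
)
print(mrcp_cz.shape, neg_peak, pos_peak, X_train.shape, X_test.shape)
```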
“…The extraction of MRCPs was conducted to investigate their discriminative patterns, as highlighted by Ofner et al. [22]. However, accurately estimating these patterns requires a substantial dataset. On the other hand, Zhang et al. [24] performed a statistical analysis, revealing significant differences in negative peak amplitude across certain motions, while no such distinction was observed for positive peak amplitude. Consequently, an additional processing step is imperative to enhance the performance of the classification model.…”
Section: Results (mentioning)
Confidence: 99%
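The statistical comparison of negative-peak amplitudes across motions mentioned above could, for example, take the form of a one-way ANOVA over per-trial peak values. The sketch below is only illustrative: the class means, trial counts, and units are invented, and the original authors may have used a different test.

```python
# Hypothetical sketch: compare negative-peak MRCP amplitudes across movement
# classes with a one-way ANOVA. Peak values per trial are synthetic, not
# results from the cited studies.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
neg_peaks = {
    "left":  rng.normal(-8.0, 2.0, 40),   # negative peak amplitude (uV) per trial
    "mid":   rng.normal(-6.5, 2.0, 40),
    "right": rng.normal(-7.0, 2.0, 40),
}

f_stat, p_value = stats.f_oneway(*neg_peaks.values())
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")    # p < 0.05 -> amplitudes differ by class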
“…There are also several studies exploring bimanual movement decoding. In [22], decoding accuracy for 7-class unimanual and bimanual reach-and-grasp movements peaked at 38.6% with the time window of [0.1, 1.1] s. In [17], decoding accuracy for 6-class unimanual and bimanual movement directions peaked at 69.02% with the time window of [0.3, 1.3] s. In [23], decoding accuracy for 3-class bimanual coordinated directions peaked at 70.1% using a window of [0, 1] s. Though using different datasets, our proposed method showed superior performance in unimanual and bimanual movement decoding. To further validate the effectiveness of our proposed method, we also compared it with several state-of-the-art models on our dataset.…”
Section: Discussion (mentioning)
Confidence: 99%
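The accuracies quoted above are each tied to a specific decoding window relative to movement onset. As a rough illustration of how such a window enters a decoding pipeline, the sketch below crops epochs to [0, 1] s and cross-validates a simple classifier; the sampling rate, data, and LDA classifier are assumptions, not the methods of the cited works.

```python
# Hypothetical sketch: crop epochs to a decoding window (e.g., [0, 1] s after
# movement onset) and score a simple classifier on the windowed features.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

fs = 250                                   # assumed sampling rate (Hz)
t_start, t_stop = 0.0, 1.0                 # decoding window relative to onset (s)
rng = np.random.default_rng(2)
epochs = rng.standard_normal((120, 61, int(2.0 * fs)))   # trials x channels x samples
labels = rng.integers(0, 3, size=120)                    # 3 coordinated directions

i0, i1 = int(t_start * fs), int(t_stop * fs)
X = epochs[:, :, i0:i1].reshape(len(epochs), -1)         # flatten windowed features

acc = cross_val_score(LinearDiscriminantAnalysis(), X, labels, cv=5).mean()
print(f"window [{t_start}, {t_stop}] s -> mean CV accuracy {acc:.2f}")
```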
“…Several studies [17], [22], [23] have taken the first step toward decoding unimanual and bimanual movements. In [22], Schwarz et al. discriminated 7-class reach-and-grasp movements using EEG time-domain features, and averaged classification accuracies reached 38.6% for a combination of six movements and one rest condition.…”
Section: (mentioning)
Confidence: 99%