2022
DOI: 10.1007/s00034-022-02164-7
An EEG-Based Thought Recognition Using Pseudo-Wigner–Kullback–Leibler Deep Neural Classification

Cited by 4 publications (1 citation statement)
References 19 publications
“…Meanwhile, the classification results obtained through multi-domain feature fusion techniques have excellent performance for classification, such as time-frequency domain fusion [19], space-frequency domain fusion [20], and time-frequency-space domain methods [21]. The development of Deep Learning (DL), Convolutional Neural Networks (CNN), Deep Belief Networks (DBN), and Recurrent Neural Networks (RNN) have also been applied to classification tasks in MI with good results [22]. However, compared to traditional methods, the results of deep learning depend on the quality of EEG samples, the complexity of operations reduces the decoding efficiency of MI-BCI, and the optimization process of the model greatly increases the training time of the MI decoding algorithm.…”
Section: Introduction
Confidence: 99%