2021
DOI: 10.1109/access.2021.3124028
EEG Mental Recognition Based on RKHS Learning and Source Dictionary Regularized RKHS Subspace Learning

Cited by 4 publications (4 citation statements)
References 36 publications
“…When the dimensionality of the original EEG data is high, the computational and storage costs will be very large. Thus, a common solution is to project the high-dimensional data into a low-dimensional space (Lei et al., 2021). Let Q ∈ R^{d×C} be the projection matrix; the projected data can be represented as…”
Section: Subspace Learning
confidence: 99%
“…Therefore, it is still a challenging task to use machine learning methods to identify AD based on EEG signals. To solve this problem, researchers usually reduce the dimensionality of high-dimensional EEG data and extract a small amount of the most valuable compact information, which not only saves storage space and processing time but also enables learning a robust model (Lei et al., 2021). Subspace learning and low-rank representation can achieve this goal well.…”
Section: Introduction
confidence: 99%
“…This property was leveraged to embed different types of data (Harandi et al., 2014; Brooks et al., 2019b). It motivated the development of various machine learning algorithms (Chevallier et al., 2017; Yair et al., 2019; Zhuang et al., 2020; Lei et al., 2021; Ju and Guan, 2022) and of neural network architectures (Huang and Van Gool, 2017; Brooks et al., 2019a).…”
Section: Symmetric Positive Definite Matrices
confidence: 99%
“…How to use label information is always the focus of various domain adaptation algorithms. Lei et al. [27] applied dictionary learning to the source domain, while we borrowed the idea of LDA.…”
Section: Introduction
confidence: 99%