2022
DOI: 10.1007/s12204-022-2488-4
Transfer Learning in Motor Imagery Brain Computer Interface: A Review

Abstract: Sparse-view computed tomography (CT), which uses a small number of projections for tomographic reconstruction, enables a much lower radiation dose to patients and accelerated data acquisition. The reconstructed images, however, suffer from strong artifacts, greatly limiting their diagnostic value. Current trends for sparse-view CT turn to the raw data for better information recovery. The resultant dual-domain methods, nonetheless, suffer from secondary artifacts, especially in ultra-sparse-view scenarios, and their gene…

Cited by 6 publications (3 citation statements)
References 174 publications
“…In deep learning, the neural network most commonly used for motor imagery classification is the CNN [14], [24], [25], [26], [27]. In the imaging field, AlexNet [43], Visual Geometry Group Network (VGGNet) [44], GoogLeNet [45], and Residual NN (ResNet) [46] have been proposed.…”
Section: B. Traditional Machine Learning and Deep Learning Approaches
Confidence: 99%
“…We employed CNNs, which are often used as classifiers in deep learning for motor imagery [24], [25], [26], [27]. We created CNNs with two different input formats.…”
Section: Deep Learning
Confidence: 99%
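The excerpt above mentions CNNs fed with two different input formats but does not specify them. A minimal sketch of two formats commonly used for motor-imagery EEG — the raw channels-by-time matrix and a per-channel time-frequency image — is shown below. The trial dimensions (22 channels, 1000 samples, i.e. 4 s at 250 Hz) follow BCI Competition IV-2a conventions and are an assumption, not taken from the cited papers; the `spectrogram` helper is likewise an illustrative stand-in for a standard STFT.

```python
import numpy as np

# Hypothetical MI-EEG trial: 22 channels x 1000 samples (4 s at 250 Hz).
# Shapes are illustrative assumptions, not from the cited papers.
rng = np.random.default_rng(0)
trial = rng.standard_normal((22, 1000))

# Format 1: raw channels-by-time matrix, as fed to a temporal-spatial CNN.
input_raw = trial[np.newaxis, :, :]  # (1, 22, 1000): batch, channel, time

def spectrogram(x, win=128, hop=64):
    """Magnitude spectrogram via a Hann-windowed FFT (simple STFT stand-in)."""
    frames = [x[i:i + win] for i in range(0, len(x) - win + 1, hop)]
    return np.abs(np.fft.rfft(np.asarray(frames) * np.hanning(win), axis=1)).T

# Format 2: time-frequency image per channel, as fed to an image-style CNN.
input_tf = np.stack([spectrogram(ch) for ch in trial])  # (22, 65, 14)

print(input_raw.shape, input_tf.shape)
```

The first format lets a CNN learn temporal and spatial filters directly from the signal, while the second reuses image-oriented architectures such as those named in the excerpts.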