2023
DOI: 10.1109/tnsre.2022.3230250
EEG Conformer: Convolutional Transformer for EEG Decoding and Visualization

Abstract: Due to the limited perceptual field, convolutional neural networks (CNNs) only extract local temporal features and may fail to capture long-term dependencies for EEG decoding. In this paper, we propose a compact Convolutional Transformer, named EEG Conformer, to encapsulate local and global features in a unified EEG classification framework. Specifically, the convolution module learns the low-level local features throughout the one-dimensional temporal and spatial convolution layers. The self-attention module i…
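The convolution module described in the abstract can be sketched as a temporal convolution (1D kernels sliding along the time axis of each electrode channel) followed by a spatial convolution that collapses the electrode dimension. The sketch below is a minimal numpy illustration of that two-stage front-end; the filter counts, kernel length, and input sizes are illustrative assumptions, not the paper's actual hyperparameters.

```python
import numpy as np

def temporal_conv(x, n_filters=8, kernel_len=25, rng=None):
    """Temporal stage: slide 1D kernels along the time axis of each channel.
    x: (channels, time) raw EEG trial -> (n_filters, channels, time')."""
    rng = rng or np.random.default_rng(0)
    w = rng.standard_normal((n_filters, kernel_len)) * 0.01  # random weights for illustration
    n_ch, n_t = x.shape
    n_t_out = n_t - kernel_len + 1
    out = np.empty((n_filters, n_ch, n_t_out))
    for f in range(n_filters):
        for c in range(n_ch):
            # flip the kernel so np.convolve performs cross-correlation
            out[f, c] = np.convolve(x[c], w[f][::-1], mode="valid")
    return out

def spatial_conv(h, rng=None):
    """Spatial stage: one kernel spanning all electrode channels per filter.
    h: (filters, channels, time') -> (filters, time')."""
    rng = rng or np.random.default_rng(1)
    n_f, n_ch, _ = h.shape
    w = rng.standard_normal((n_f, n_ch)) * 0.01
    return np.einsum("fct,fc->ft", h, w)

# A hypothetical trial: 22 electrodes, 1000 time samples.
trial = np.random.default_rng(2).standard_normal((22, 1000))
tokens = spatial_conv(temporal_conv(trial))
print(tokens.shape)  # (8, 976)
```

In the full model, the resulting feature sequence would then be fed to the self-attention module so that global dependencies across time can complement these local features.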

Cited by 191 publications (82 citation statements)
References 41 publications
“…Recently, deep learning (DL) methods such as Convolutional Neural Networks (CNNs) [5], Recurrent Neural Networks (RNNs) [6], and Transformers [7] have received increasing interest as a means of classifying EEG signals [8]-[10]. At present, in addition to conventional CNNs or RNNs, DL of EEG incorporates many other technologies [11]-[15].…”
Section: Introduction
confidence: 99%
“…Finally, when the pre-trained model is fitted on UI data with stronger transferability, fewer adjustments are needed to achieve rapid convergence to the ideal UD model. It is still possible that a large-scale unified DNN algorithm can be trained to render a more effective pre-trained model [28], or the linear combination of the representatives can be used to further boost performance [7]. However, the authors still want to underline that it is feasible to build an eligible pre-trained model with state-of-the-art algorithms, if inter-data transferability is respected.…”
Section: B. UI Model as Pre-trained Model
confidence: 99%
“…Additionally, some applications of attention-based DL networks in MI have been developed in recent years. Song et al [62] proposed EEG Conformer, a convolutional transformer for EEG decoding and visualization. Liu et al [63] proposed TCACNet, a temporal and channel attention convolutional network for MI-EEG classification.…”
Section: Overview of Existing MI-BCI Approaches
confidence: 99%