2022
DOI: 10.1109/tnsre.2022.3194600
A Transformer-Based Approach Combining Deep Learning Network and Spatial-Temporal Information for Raw EEG Classification

Abstract: The attention mechanism of the Transformer has the advantage of extracting feature correlation in the long-sequence data and visualizing the model. As time-series data, the spatial and temporal dependencies of the EEG signals between the time points and the different channels contain important information for accurate classification. So far, Transformer-based approaches have not been widely explored in motor-imagery EEG classification and visualization, especially lacking general models based on cross-individu…
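The attention mechanism described in the abstract can be illustrated with a minimal sketch: scaled dot-product self-attention applied to a raw EEG segment, where the attention matrix captures pairwise dependencies between time points. The shapes below (128 time points, 22 channels, a 16-dimensional head) and the weight matrices `Wq`, `Wk`, `Wv` are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence X of shape (T, C)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)   # (T, T): affinity between every pair of time points
    A = softmax(scores, axis=-1)      # attention weights; each row sums to 1
    return A @ V, A

rng = np.random.default_rng(0)
T, C, d = 128, 22, 16                 # time points, EEG channels, head dim (illustrative)
X = rng.standard_normal((T, C))       # one raw EEG segment: T samples x C channels
Wq, Wk, Wv = (rng.standard_normal((C, d)) * 0.1 for _ in range(3))
out, A = self_attention(X, Wq, Wk, Wv)
print(out.shape, A.shape)             # (128, 16) (128, 128)
```

The attention matrix `A` is also what makes such models visualizable: each row shows which time points the model attends to when encoding a given sample, which is the kind of interpretability the abstract refers to.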


Cited by 126 publications (41 citation statements)
References 59 publications
“…Results of the experiments show that the proposed work provides a trustworthy system with a high average accuracy of 94.4% and a training time of 0.6 s for the MAHNOB-HCI database. The Transformer's feature correlation extraction and display method uses long-sequence data [19]. For precise classification, EEG data across time points and channels must exhibit spatial and temporal correlations.…”
Section: In-Depth Review of Existing EEG Processing Models (mentioning, confidence: 99%)
See 1 more Smart Citation
“…Results of the experiments show that the proposed work provides a trustworthy system with a high average accuracy of 94.4% and a training time of 0.6 s for the MAHNOB-HCI database. The Transformer's feature correlation extraction and display method uses long-sequence data [19]. For precise classification, EEG data across time points and channels must exhibit spatial and temporal correlations.…”
Section: In Depth Review Of Existing Eeg Processing Modelsmentioning
confidence: 99%
“…Work in [18] developed a therapy for the prevention and treatment of epilepsy by using a solution that was based on an in-depth learning technique that was constructed on a cloud platform. A deep network-based coding technique was developed for the analysis of epileptic EEG data by both [19] and [20]. While the bulk of these studies have concentrated on regular data.…”
Section: Introduction (mentioning, confidence: 99%)
“…Lately, attention-based Transformer models have made waves in natural language and image processing due to the inherent perception of global dependencies [20]. Transformers also emerge in EEG decoding and achieve good performance, by leveraging long-term temporal relationships [21], [22]. However, such models ignore learning local features, which are also necessary for EEG decoding.…”
Section: Introduction (mentioning, confidence: 99%)
“…The solutions obtained by the filter coefficients are dependent on the initial parameters (9,10) . Transformer-based approaches have been used to classify the EEG signals with the spatial-temporal characteristics of EEG, with a good percentage of accuracy, but there is a chance of improving the accuracy even more (11) .…”
Section: Introduction (mentioning, confidence: 99%)