2020
DOI: 10.48550/arxiv.2012.01074
Preprint

Comparison of Attention-based Deep Learning Models for EEG Classification

Abstract: Objective: To evaluate the impact of different kinds of attention mechanisms on Electroencephalography (EEG) classification with Deep Learning (DL) models. Methods: We compared three attention-enhanced DL models: the brand-new InstaGATs, an LSTM with attention, and a CNN with attention. We used these models to classify normal and abnormal (i.e., artifactual or pathological) EEG patterns. Results: We achieved the state of the art in all classification problems, regardless of the large variability of the datasets and the simple …
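The abstract comes with no code, so as a rough illustration of what one of the compared models, an LSTM with attention, might look like for binary (normal vs. abnormal) EEG classification, a minimal PyTorch sketch follows. The layer sizes, names, and the additive attention over time steps are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumption, not the paper's code): an LSTM with additive
# attention over time steps for binary EEG classification.
import torch
import torch.nn as nn

class AttentionLSTM(nn.Module):
    def __init__(self, n_channels=19, hidden=64, n_classes=2):
        super().__init__()
        self.lstm = nn.LSTM(n_channels, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)   # one relevance score per time step
        self.cls = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):                       # x: (batch, time, channels)
        h, _ = self.lstm(x)                     # h: (batch, time, 2*hidden)
        w = torch.softmax(self.attn(h), dim=1)  # attention weights over time
        context = (w * h).sum(dim=1)            # weighted sum of hidden states
        return self.cls(context)                # logits: normal vs. abnormal

# Example: 8 epochs of 128 samples from 19 EEG channels (hypothetical shapes).
logits = AttentionLSTM()(torch.randn(8, 128, 19))
print(logits.shape)  # torch.Size([8, 2])
```

The CNN-with-attention and InstaGATs variants compared in the paper would swap out the recurrent encoder; their exact architectures are not reproduced here.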

Cited by 5 publications (8 citation statements)
References 29 publications
“…While most of the works on transformers reverberated the temporal domain, employing the multi-head attention over the spatial ordinate to encompass the inter-region representational similarities may be more beneficial. This idea is further endorsed by literature on other EEG signal classification systems, where the brain connectivity and inter-channel relationships are hinted to be beneficial [22,19].…”
Section: Introduction
Mentioning confidence: 91%
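One way to read the suggestion above, applying multi-head attention over the spatial ordinate, is to treat each EEG channel as a token so the heads can model inter-channel relationships. The sketch below is a hypothetical illustration of that idea using PyTorch's nn.MultiheadAttention; the projection size, head count, and tensor layout are assumptions, not taken from the cited works.

```python
# Hypothetical sketch: multi-head attention applied across EEG channels
# (the spatial axis) rather than across time steps.
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    def __init__(self, time_steps=128, d_model=64, n_heads=4):
        super().__init__()
        self.embed = nn.Linear(time_steps, d_model)  # one token per channel
        self.mha = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, x):                            # x: (batch, channels, time)
        tokens = self.embed(x)                       # (batch, channels, d_model)
        mixed, weights = self.mha(tokens, tokens, tokens)
        return mixed, weights                        # weights: (batch, channels, channels)

# Example: 8 trials, 19 channels, 128 time samples (hypothetical shapes).
mixed, weights = ChannelAttention()(torch.randn(8, 19, 128))
print(mixed.shape, weights.shape)  # (8, 19, 64) and (8, 19, 19)
```

The returned weight matrix is a channel-by-channel map per trial, one plausible stand-in for the inter-channel relationships the quotation points to.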
“…The deep learning architectures, although newly introduced, have superseded their performances [13,14,15]. Convolutional Neural Networks (CNNs), Deep Belief Networks (DBNs), and Recurrent Neural Networks (RNNs) are the primarily used architectures [16,17], with sparing, but increasing involvement of attention-based models [18,19]. The Transformer Network [20] that employs Multi-Head Attention (MHA), has recently been introduced to EEG systems [21].…”
Section: Introduction
Mentioning confidence: 99%
“…In work [17], the author has discussed the difference between machine learning techniques and deep learning methods in distinguishing patients under antiepileptic drugs and those taking no medications, as well as between the two anticonvulsants. The method was validated on the TUSZ dataset since it is the largest available dataset [5]. The comparison invoked in the work [18] shows that a small difference exists between the used ML techniques and deep model in achieving a moderate accuracy rate for medication use detection.…”
Section: B. Related Work on CHB-MIT and TUSZ Datasets
Mentioning confidence: 99%
“…Similarly, if we assume that no attentional state is required during the journey between two consecutive landmarks (i.e., passing by the non-relevant places), then a negative label can be associated to the corresponding EEG data. Thus, we could exploit advanced supervised ML methods to train a binary classifier that could accurately classify the EEG data either as attention-related or non-attention related samples [67,68,69,70,71]. At this stage, several deep architectures can be used to identify the brain networks features which are related to the individual's attentional state.…”
Section: Integration Between Nudge and Neurofeedback
Mentioning confidence: 99%
“…At this stage, several deep architectures can be used to identify the brain network features which are related to the individual's attentional state. Moreover, by designing and implementing specific "attentional mechanisms", we could weight those pieces of EEG information that mostly impact on determining the classification of the individual's attentional level [67,68].…”
Section: Integration Between Nudge and Neurofeedback
Mentioning confidence: 99%