2022
DOI: 10.1109/taffc.2022.3199075
A Dual-Branch Dynamic Graph Convolution Based Adaptive TransFormer Feature Fusion Network for EEG Emotion Recognition

Cited by 51 publications
(17 citation statements)
References 36 publications
“…Despite most of the surveyed papers being relatively recent, a wide range of GNN-based methods has already been proposed to classify EEG signals in a diverse set of tasks, such as emotion recognition, brain-computer interfaces, and psychological and neurodegenerative disorders and diseases [46], [53], [54], [56], [58], [61], [70], [72], [75], [83], [89], [106]. Chebyshev Graph Convolution: [49], [51], [55], [57], [59], [66], [67], [69], [71], [74], [76]-[78], [80], [82], [85], [90], [97], [99], [104]. Graph Attention Network: [60], [62], [73], [84], [88], [94], [98]. This survey categorises the proposed GNN models in terms of their inputs and modules. Specifically, these are brain graph structure, node features and their preprocessing, GCN layers, node pooling mechanisms, and formation of graph embeddings.…”
Section: Discussion
confidence: 99%
“…These methods can be generally categorised as learnable or pre-defined. Multiple/combined graph definitions: [47], [49], [53], [54], [57]-[59], [61]-[64], [67], [69], [72], [79], [81], [82], [87], [92], [102]. A second table row (label lost in extraction): [51], [53], [55], [57], [71], [72], [75], [78], [81], [82], [87], [89], [90], [92], [93], [95]-[99], [101], [102]. Raw signal: …”
Section: Definition Of Brain Graph Structure
confidence: 99%
“…Moreover, to further uncover the relationships between EEG channels, dynamic graph convolution has gained significant traction [16]. Dynamic graph neural networks employ a learnable adjacency matrix as a parameter, which is updated during the training process [17,18,19].…”
Section: Dynamic Graph Convolutional Network for BCIs
confidence: 99%
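The learnable-adjacency idea described in the last citation statement can be sketched in a few lines. The following is a hypothetical minimal PyTorch layer, not the cited papers' actual implementation: the channel count (62, as in SEED-style EEG setups), the five band-power input features, and the symmetrisation/normalisation choices are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DynamicGraphConv(nn.Module):
    """Graph convolution whose adjacency matrix is a trainable parameter
    (a minimal sketch of the 'dynamic graph' idea, with assumed details)."""

    def __init__(self, num_nodes: int, in_features: int, out_features: int):
        super().__init__()
        # Learnable adjacency: one weight per EEG-channel pair, updated by
        # backpropagation together with the rest of the network.
        self.adj = nn.Parameter(torch.rand(num_nodes, num_nodes))
        self.weight = nn.Linear(in_features, out_features, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_nodes, in_features)
        a = F.relu(self.adj)                    # keep edge weights non-negative
        a = 0.5 * (a + a.transpose(0, 1))       # symmetrise the learned graph
        deg = a.sum(dim=1).clamp(min=1e-6)      # node degrees for normalisation
        d_inv_sqrt = deg.pow(-0.5)
        a_norm = d_inv_sqrt.unsqueeze(1) * a * d_inv_sqrt.unsqueeze(0)
        # Aggregate neighbour features, then apply the linear transform.
        return F.relu(self.weight(a_norm @ x))


# Usage with assumed SEED-like dimensions: 62 channels, 5 band features.
layer = DynamicGraphConv(num_nodes=62, in_features=5, out_features=16)
out = layer(torch.randn(8, 62, 5))
print(out.shape)  # torch.Size([8, 62, 16])
```

Because `self.adj` is an `nn.Parameter`, any optimiser stepping over `layer.parameters()` updates the channel-connectivity graph itself during training, which is the defining property of the dynamic graph networks the excerpt refers to.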