2023
DOI: 10.3390/s23031404
Fusion Graph Representation of EEG for Emotion Recognition

Abstract: Various relations existing in Electroencephalogram (EEG) data are significant for EEG feature representation. Thus, studies on graph-based methods focus on extracting the relevance between EEG channels. The shortcoming of existing graph studies is that they consider only a single relationship between EEG electrodes, which results in an incomplete representation of EEG data and relatively low accuracy of emotion recognition. In this paper, we propose a fusion graph convolutional network (FGCN) to extract various re…
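The abstract is truncated in this record. As a rough, hedged illustration of the fusion idea it describes (not the authors' code), the sketch below combines several per-relation adjacency matrices into one normalized graph and applies a single graph-convolution step over EEG channel features; the weights, the feature matrix `X`, and the helper `fuse_and_propagate` are hypothetical names chosen for this example.

```python
import numpy as np

def normalize_adjacency(A):
    """Symmetrically normalize an adjacency matrix with self-loops: D^-1/2 (A+I) D^-1/2."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def fuse_and_propagate(adjacencies, weights, X, W):
    """Fuse several relation graphs (weighted sum) and apply one GCN step: ReLU(A_fused X W)."""
    A_fused = sum(w * A for w, A in zip(weights, adjacencies))
    A_norm = normalize_adjacency(A_fused)
    return np.maximum(A_norm @ X @ W, 0.0)  # ReLU activation

# Toy example: 62 EEG channels, 5 features per channel, 3 relation graphs.
rng = np.random.default_rng(0)
n_channels, n_features, n_hidden = 62, 5, 16
adjacencies = [rng.random((n_channels, n_channels)) for _ in range(3)]
adjacencies = [(A + A.T) / 2 for A in adjacencies]   # keep each graph symmetric
X = rng.standard_normal((n_channels, n_features))    # per-channel EEG features
W = rng.standard_normal((n_features, n_hidden))      # learnable projection
H = fuse_and_propagate(adjacencies, weights=[0.5, 0.3, 0.2], X=X, W=W)
print(H.shape)  # (62, 16)
```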

Cited by 19 publications (4 citation statements)
References 33 publications
“…This clearly demonstrates the significant effect of using shorter time windows and focusing on the latest data during the emotion recognition process, showcasing the feasibility of real-time EEG emotion recognition. However, the experimental results are not as good as those of DGCNN [26], GLEM [27], BODF [28], and FGCN [29]. This may be due to other trade-offs made to achieve real-time performance, resulting in some loss of information in other respects.…”
Section: Results (mentioning)
confidence: 99%
“…In reference [46], Li et al. propose a fusion graph convolutional network (FGCN) architecture for extracting and combining various relationships in EEG data to obtain a more comprehensive representation for emotion recognition. FGCN initially identifies brain connectivity features based on topology, causality, and function.…”
Section: Related Studies (mentioning)
confidence: 99%
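The excerpt above names three relation types (topology, causality, function) but not how they are built. Purely as a hedged illustration, the snippet below constructs simple stand-ins for each: a distance-thresholded topology graph, a lagged-correlation proxy for causality (not true Granger causality), and a Pearson-correlation functional graph. All function names, thresholds, and data shapes are assumptions for this sketch, not the construction used in the cited paper.

```python
import numpy as np

def topology_graph(positions, radius=0.5):
    """Connect electrodes whose 3D positions lie within `radius` of each other."""
    diff = positions[:, None, :] - positions[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    A = (dist < radius).astype(float)
    np.fill_diagonal(A, 0.0)
    return A

def lagged_correlation_graph(signals, lag=1):
    """Crude directed 'causality' proxy: correlation between x(t) and y(t + lag)."""
    n = signals.shape[0]
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                A[i, j] = abs(np.corrcoef(signals[i, :-lag], signals[j, lag:])[0, 1])
    return A

def functional_graph(signals):
    """Undirected functional graph from absolute Pearson correlation."""
    A = np.abs(np.corrcoef(signals))
    np.fill_diagonal(A, 0.0)
    return A

# Toy data: 32 channels, 1000 samples, random 3D electrode positions.
rng = np.random.default_rng(1)
signals = rng.standard_normal((32, 1000))
positions = rng.random((32, 3))
graphs = [topology_graph(positions), lagged_correlation_graph(signals), functional_graph(signals)]
```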
“…With the aid of X_l^1 and X_l^2, we undertake the computation of the adjacency matrix, which serves to represent the connectivity between various EEG channels or nodes within a graph-based brain network. Here, we have opted for four commonly employed functional connectivity methods, namely correlation (Cor), coherence (Coh), phase locking value (PLV), and phase lag index (PLI), all of which have shown promising results in EEG-based emotion recognition [48][49][50][51].…”
Section: Task-specific Adjacency Matrices Construction (mentioning)
confidence: 99%
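For context on the four connectivity measures named in the excerpt, the sketch below computes pairwise correlation, coherence, PLV, and PLI adjacency matrices from a multichannel EEG array. It is a minimal illustration assuming a NumPy array of shape (channels, samples) and a known sampling rate; it is not code from the cited work, and the function name is hypothetical.

```python
import numpy as np
from scipy.signal import coherence, hilbert

def connectivity_matrices(eeg, fs=128.0):
    """Return (correlation, coherence, PLV, PLI) matrices for an array of shape (channels, samples)."""
    n = eeg.shape[0]
    cor = np.abs(np.corrcoef(eeg))               # Pearson correlation per channel pair
    phase = np.angle(hilbert(eeg, axis=1))       # instantaneous phase of each channel
    coh = np.zeros((n, n))
    plv = np.zeros((n, n))
    pli = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            f, cxy = coherence(eeg[i], eeg[j], fs=fs)
            coh[i, j] = cxy.mean()                               # mean coherence over frequencies
            dphi = phase[i] - phase[j]
            plv[i, j] = np.abs(np.mean(np.exp(1j * dphi)))       # phase locking value
            pli[i, j] = np.abs(np.mean(np.sign(np.sin(dphi))))   # phase lag index
    for A in (cor, coh, plv, pli):
        np.fill_diagonal(A, 0.0)
    return cor, coh, plv, pli

# Toy usage: 14 channels, 4 seconds at 128 Hz.
rng = np.random.default_rng(2)
eeg = rng.standard_normal((14, 512))
cor_A, coh_A, plv_A, pli_A = connectivity_matrices(eeg)
```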