2021
DOI: 10.3389/fnhum.2020.605246

Two-Level Domain Adaptation Neural Network for EEG-Based Emotion Recognition

Abstract: Emotion recognition plays an important part in human-computer interaction (HCI). Currently, the main challenge in electroencephalogram (EEG)-based emotion recognition is the non-stationarity of EEG signals, which causes the performance of the trained model to degrade over time. In this paper, we propose a two-level domain adaptation neural network (TDANN) to construct a transfer model for EEG-based emotion recognition. Specifically, deep features from the topological graph, which preserve topological information f…

Cited by 43 publications (24 citation statements)
References 41 publications
“…The linear dynamic system method was used to filter noise and artifacts unrelated to the EEG features (Shi and Lu, 2010). Next, the DE features were transformed into 32 × 32 × 5 (length and width = 32, channels = 5) topology images according to the method in the literature (Bao et al., 2021), namely TP-DE, as shown in Figure 2.…”
Section: Methods
confidence: 99%
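The cited statement describes mapping per-channel differential entropy (DE) features into 32 × 32 × 5 topology images. The sketch below is a minimal illustration of that kind of transformation, not the authors' exact pipeline: it assumes projected 2-D electrode coordinates (here placeholders) and interpolates each of the five band-wise DE values onto a 32 × 32 scalp grid.

```python
# Hedged sketch: build a 32x32x5 "topology image" (TP-DE) from per-channel DE
# features by interpolating electrode values onto a 2-D grid, one frequency
# band per image channel. Electrode coordinates are assumed inputs.
import numpy as np
from scipy.interpolate import griddata

def de_to_topology_image(de_features, electrode_xy, grid_size=32):
    """de_features: (n_channels, 5) DE values, one column per frequency band.
    electrode_xy: (n_channels, 2) projected electrode positions scaled to [0, 1]."""
    xs = np.linspace(0.0, 1.0, grid_size)
    grid_x, grid_y = np.meshgrid(xs, xs)
    image = np.zeros((grid_size, grid_size, de_features.shape[1]))
    for band in range(de_features.shape[1]):
        # Interpolate this band's channel values onto the regular grid.
        image[:, :, band] = griddata(
            electrode_xy, de_features[:, band],
            (grid_x, grid_y), method="cubic", fill_value=0.0)
    return image  # shape (32, 32, 5)
```

Whether interpolation, azimuthal projection, or simple channel placement is used in the original method is not specified in the excerpt; the grid dimensions are the only details taken from the quoted text.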
“…In our previous work (Bao et al., 2021), we proposed a deep neural network (DNN) classification model that can effectively extract features from the topological graph; its structure is shown in Figure 3. We add an AdaBN layer (Li et al., 2018) after each convolution layer and fully connected layer, which standardizes the distribution between the real samples and the generated samples in each batch.…”
Section: Methods
confidence: 99%
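The excerpt refers to inserting an AdaBN layer after each convolution and fully connected layer. The following is a hedged PyTorch sketch of that general idea (adaptive batch normalization, Li et al., 2018) on 32 × 32 × 5 topology images; the layer sizes and class count are illustrative assumptions, not the network from the cited paper.

```python
# Sketch of the AdaBN idea: BatchNorm follows every conv / fully connected
# layer, and at adaptation time the BN running statistics are re-estimated on
# target-domain batches while the learned weights stay frozen.
import torch
import torch.nn as nn

class TopoCNN(nn.Module):
    def __init__(self, n_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(5, 32, kernel_size=3, padding=1), nn.BatchNorm2d(32), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.BatchNorm2d(64), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, 128), nn.BatchNorm1d(128), nn.ReLU(),
            nn.Linear(128, n_classes),
        )

    def forward(self, x):  # x: (batch, 5, 32, 32) topology images
        return self.classifier(self.features(x))

@torch.no_grad()
def adapt_bn_statistics(model, target_loader):
    """AdaBN step: forward target-domain batches in train mode so the BatchNorm
    running means/variances are replaced by target-domain statistics."""
    model.train()
    for x, _ in target_loader:
        model(x)
    model.eval()
```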
“…Common approaches include domain-adversarial neural networks [19] and approaches using the maximum mean discrepancy (MMD) loss [20,21]. Unsupervised domain adaptation techniques have already demonstrated their potential for EEG-based applications such as emotion recognition and brain-computer interfaces [22,23,24,25,26]. These domain adaptation techniques have mainly focused on personalization, cross-session adaptation, and domain mismatch between different datasets, with the same or very similar channels recorded in both datasets.…”
Section: Introduction
confidence: 99%
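As a small illustration of the MMD loss mentioned in this excerpt, the snippet below computes an RBF-kernel maximum mean discrepancy between source- and target-domain feature batches. It is a generic sketch of the standard formulation, not the specific implementation used in any of the cited works; the bandwidth parameter is an assumption.

```python
# RBF-kernel MMD between two feature batches, usable as a domain-alignment loss.
import torch

def mmd_rbf(source, target, sigma=1.0):
    """source: (n, d) source-domain features; target: (m, d) target-domain features."""
    def kernel(a, b):
        dist = torch.cdist(a, b) ** 2            # pairwise squared Euclidean distances
        return torch.exp(-dist / (2 * sigma ** 2))
    return (kernel(source, source).mean()
            + kernel(target, target).mean()
            - 2 * kernel(source, target).mean())
```

In practice this term is added to the classification loss so that features extracted from the two domains are pulled toward the same distribution.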
“…For EEG-based domain adaptation, current studies focus on emotion recognition, cognitive load recognition, movement recognition, and motor imagery decoding. Li et al. and Bao et al. investigated multi-source transfer learning and two-level domain adaptation neural networks, respectively, for cross-subject EEG emotion recognition (Li et al., 2019; Bao et al., 2021). Jimenez-Guarneros et al. proposed custom domain adaptation for cross-subject cognitive load recognition (Jimenez-Guarneros and Gomez-Gil, 2020).…”
Section: Introduction
confidence: 99%
“…For EEG-based domain adaptation, current studies focus on emotion recognition, cognitive load recognition, movement recognition, and motor imagery decoding. Li et al. and Bao et al. investigated multi-source transfer learning and two-level domain adaptation neural networks, respectively, for cross-subject EEG emotion recognition (Li et al., 2019; Bao et al., 2021) … (Tang and Zhang, 2020). However, there is little research on the task of EEG-based target detection, which might be caused by the difficulty of training on an imbalanced dataset for EEG-based target detection.…”
Section: Introduction
confidence: 99%