2023
DOI: 10.1016/j.patrec.2023.02.022

Editorial for Pattern Recognition Letters special issue on face-based emotion understanding

Cited by 3 publications (1 citation statement)
References 19 publications
“…In the context of various applications, self-attention has been used to improve model performance. Li et al. [23] addressed the problem of poor generalization in EEG-based emotion recognition models by combining convolutional networks and bidirectional LSTM models with self-attention. The self-attention mechanism helped extract important information by re-weighting the different channels, accounting for the varying significance of channels and samples during feature extraction.…”
Section: Proposed Methods, A. MGTS-Attention (citation type: mentioning)
confidence: 99%
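The citation statement above describes a CNN + bidirectional LSTM + self-attention pipeline for EEG-based emotion recognition. The following PyTorch sketch is a rough illustration of how such a pipeline can be wired together; it is not Li et al.'s actual MGTS-Attention model [23], and the channel count, window length, layer sizes, and class count are assumed values chosen only for the example.

# Hedged sketch of a CNN + BiLSTM + self-attention classifier for EEG windows.
# All dimensions are illustrative assumptions, not values from the cited paper.
import torch
import torch.nn as nn

class EEGEmotionNet(nn.Module):
    def __init__(self, n_channels=32, n_classes=3, hidden=64):
        super().__init__()
        # 1-D convolution extracts local temporal features from the raw channels
        self.conv = nn.Sequential(
            nn.Conv1d(n_channels, hidden, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # Bidirectional LSTM models longer-range temporal dependencies
        self.bilstm = nn.LSTM(hidden, hidden, batch_first=True, bidirectional=True)
        # Self-attention re-weights the sequence of BiLSTM outputs, analogous to
        # the channel/sample re-weighting described in the citation statement
        self.attn = nn.MultiheadAttention(embed_dim=2 * hidden, num_heads=4,
                                          batch_first=True)
        self.classifier = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):            # x: (batch, n_channels, time)
        h = self.conv(x)             # (batch, hidden, time/2)
        h = h.transpose(1, 2)        # (batch, time/2, hidden)
        h, _ = self.bilstm(h)        # (batch, time/2, 2*hidden)
        h, _ = self.attn(h, h, h)    # self-attention over the time dimension
        return self.classifier(h.mean(dim=1))  # average-pool, then classify

# Example: a batch of 8 EEG windows, 32 channels, 128 samples each
logits = EEGEmotionNet()(torch.randn(8, 32, 128))
print(logits.shape)  # torch.Size([8, 3])

In this sketch the self-attention layer sits on top of the BiLSTM and re-weights time steps before pooling, which plays the same role as the importance re-weighting of channels and samples that the citing authors attribute to the self-attention mechanism.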