2020
DOI: 10.1016/j.ifacol.2021.04.155
Multi-EmoNet: A Novel Multi-Task Neural Network for Driver Emotion Recognition

Cited by 13 publications (2 citation statements)
References 17 publications
“…As shown in Table 5, most of the facial-expression-based studies choose discrete emotions (the six basic emotions and neutral) as the detection targets, and deep neural networks, such as CNN-based models, are commonly used for facial classification, for example GLFCNN [82], Xception [86,87], and basic CNNs [85,90]. Equations (1) and (2) show the most commonly used loss functions (multiclass cross-entropy loss and binary cross-entropy loss) for facial-expression-based emotion detection [82,85,86,88].…”
Section: Facial-expression-based (mentioning)
confidence: 99%
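For context on the losses the quoted statement refers to, a minimal sketch of the standard forms is given below; the cited Equations (1) and (2) are not reproduced here, and the symbols C, y_c, and \hat{y}_c are notation assumed for illustration rather than taken from the cited papers.

% Multiclass cross-entropy over C emotion classes, with one-hot label y_c and predicted probability \hat{y}_c:
\mathcal{L}_{\mathrm{CE}} = -\sum_{c=1}^{C} y_c \log \hat{y}_c

% Binary cross-entropy for a single label y \in \{0,1\} with predicted probability \hat{y}:
\mathcal{L}_{\mathrm{BCE}} = -\bigl[\, y \log \hat{y} + (1 - y)\log(1 - \hat{y}) \,\bigr]

In practice the multiclass form is applied to a softmax output over the emotion categories, while the binary form is used per label when emotions are predicted independently.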
“…Emotion recognition is an important part of the interaction between people and machines [1]; it has broad application prospects in fields such as distance education [2], psychological therapy [3], and assisted driving [4]. Considerable work on emotion recognition has already been performed, but some obstacles to accurate emotion recognition still remain.…”
Section: Introduction (mentioning)
confidence: 99%