2022
DOI: 10.1109/taffc.2022.3188390
Classifying Emotions and Engagement in Online Learning Based on a Single Facial Expression Recognition Neural Network

Cited by 163 publications (57 citation statements). References 52 publications.
“…The current state of the art for emotion recognition on the AffectNet dataset is [14], which proposes face detection, tracking, and clustering techniques to extract the sequences of faces from each frame. Next, a single efficient neural network is used to extract emotional features in each frame.…”
Section: Related Work
confidence: 99%
“…We follow [51] for 8-class emotion classification, where an emotion label is given for each video frame. We further apply a sliding-window smoothing algorithm [52] in the temporal domain to smooth the distribution of emotions over time.…”
Section: A4 Selected Algorithms
confidence: 99%
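The temporal smoothing described above can be sketched as follows. This is a minimal illustration using a centered moving average over per-frame emotion probability distributions; the specific algorithm of [52] is not given here, so the window size and averaging scheme are assumptions.

```python
import numpy as np

def smooth_emotion_probs(frame_probs, window=5):
    """Smooth per-frame emotion probability distributions along time.

    frame_probs: array of shape (T, 8) -- one probability vector per frame
    window: size of the centered sliding window (an assumed parameter).
    Returns an array of the same shape, averaged over each window.
    """
    frame_probs = np.asarray(frame_probs, dtype=float)
    half = window // 2
    smoothed = np.empty_like(frame_probs)
    for t in range(len(frame_probs)):
        # Clip the window at the sequence boundaries.
        lo, hi = max(0, t - half), min(len(frame_probs), t + half + 1)
        smoothed[t] = frame_probs[lo:hi].mean(axis=0)
    return smoothed
```

A window that is too wide blurs short emotional events, so in practice the window size would be tuned on validation data.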
“…This module allows us to combine the results of several statistical functions (such as minimum, maximum, mean value, and standard deviation) calculated component by component over the features of all the video frames. As a result, we obtain a descriptor that contains information about all the input frames [23]. We apply the L2-norm to the descriptors and use the result to train an SVM with a linear kernel that predicts one of the classes of emotions.…”
Section: Proposed Approach
confidence: 99%
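The statistical pooling step described above can be sketched as follows: component-wise min, max, mean, and standard deviation of the per-frame features are concatenated into one video-level descriptor and L2-normalized. The function name and the exact set of statistics beyond those quoted are assumptions; the resulting descriptors would then feed a linear-kernel SVM (e.g. `sklearn.svm.LinearSVC`).

```python
import numpy as np

def video_descriptor(frame_features):
    """Pool per-frame features into a single video descriptor.

    frame_features: array of shape (T, D) -- one feature vector per frame.
    Concatenates component-wise min, max, mean, and std -> shape (4*D,),
    then applies L2 normalization.
    """
    X = np.asarray(frame_features, dtype=float)
    desc = np.concatenate([X.min(axis=0), X.max(axis=0),
                           X.mean(axis=0), X.std(axis=0)])
    norm = np.linalg.norm(desc)
    # Guard against an all-zero descriptor before normalizing.
    return desc / norm if norm > 0 else desc
```

Because each statistic is computed per component, the descriptor length depends only on the feature dimension D, not on the number of frames, so videos of different lengths map to fixed-size inputs for the SVM.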