2021
DOI: 10.1016/j.procs.2021.09.115

Students’ emotion extraction and visualization for engagement detection in online learning

Cited by 45 publications (20 citation statements)
References 10 publications
“…The proposed system outperformed other existing engagement detection systems in terms of performance accuracy, as given in Table 7. Our proposed system provided better results than the other approaches used by [28,47,55,69]. Our system used the information from three modalities: facial emotion, eye-blinking, and head movement.…”
Section: Discussion
confidence: 89%
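The statement above summarizes a system that fuses three modalities (facial emotion, eye-blinking, and head movement) into an engagement decision. As a minimal sketch of that idea only, the Python snippet below combines three per-frame cue scores with fixed weights; the weights, the reference blink rate, and the scoring rules are illustrative assumptions, not values taken from the cited system.

```python
# Illustrative sketch only: fuse three per-frame engagement cues
# (facial-emotion score, eye-blink rate, head-movement magnitude).
# The weights, reference blink rate, and scoring rules are assumptions,
# not values reported in the cited system.
from dataclasses import dataclass

@dataclass
class FrameCues:
    emotion_score: float   # 0..1, probability mass on "engaged" emotions
    blink_rate: float      # blinks per minute over a sliding window
    head_movement: float   # normalized head-pose change (0 = still)

def engagement_score(cues: FrameCues,
                     weights=(0.5, 0.25, 0.25),
                     normal_blink_rate=17.0,
                     max_head_movement=1.0) -> float:
    """Weighted fusion of the three modalities into a 0..1 engagement score."""
    # Large deviation from a typical blink rate is treated as lower attention.
    blink_score = max(0.0, 1.0 - abs(cues.blink_rate - normal_blink_rate) / normal_blink_rate)
    # Excessive head movement is treated as distraction.
    movement_score = max(0.0, 1.0 - cues.head_movement / max_head_movement)
    w_emotion, w_blink, w_motion = weights
    return w_emotion * cues.emotion_score + w_blink * blink_score + w_motion * movement_score

# Example: a mostly attentive frame yields a high score.
print(engagement_score(FrameCues(emotion_score=0.8, blink_rate=15.0, head_movement=0.2)))
```

In practice, each cue score would come from a separate per-frame module (facial-emotion classifier, blink detector, head-pose estimator) rather than being supplied by hand.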
“…In [55], authors used emotions for automatic engagement detection with an accuracy value of 70% on the FER-2013 dataset. Authors in [28] used eye-gaze and emotions for emotion visualisation and engagement detection on the CK+ dataset with an accuracy of 73.4%. Conclusively, we achieved an accuracy rate of 92.58% for engagement state prediction.…”
Section: Discussion
confidence: 99%
“…They have also been used to determine the concentration levels of students. Hasnine et al [ 10 ] utilized CNNs to detect the concentration level of students. Six types of basic emotions were extracted using a pre-trained CNN, and those were used to detect the concentration level of students in a virtual classroom.…”
Section: Literature Review
confidence: 99%
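As a minimal sketch of the pipeline described above (a pre-trained CNN extracts basic emotions, which are then mapped to a concentration level), the snippet below assumes a Keras-style classifier; the model file name, the 48x48 grayscale input shape, and the choice of which emotions indicate focus are illustrative assumptions, not details from Hasnine et al. [10].

```python
# Illustrative sketch only, assuming a Keras-style pre-trained facial-emotion
# classifier. The file name "emotion_cnn.h5", the 48x48 grayscale input shape
# (typical for FER-style data), and the set of "focused" emotions are
# hypothetical choices, not details from the cited paper.
import numpy as np
from tensorflow.keras.models import load_model

EMOTIONS = ["angry", "disgusted", "fearful", "happy", "sad", "surprised", "neutral"]
FOCUSED = ["happy", "neutral", "surprised"]  # assumed to indicate concentration

model = load_model("emotion_cnn.h5")  # hypothetical pre-trained model

def concentration_index(face_batch: np.ndarray) -> float:
    """face_batch: (N, 48, 48, 1) grayscale face crops scaled to [0, 1].

    Returns the average probability mass the classifier places on the
    'focused' emotions across the batch, used here as a crude proxy for
    the students' concentration level.
    """
    probs = model.predict(face_batch, verbose=0)   # shape (N, 7)
    focused_cols = [EMOTIONS.index(e) for e in FOCUSED]
    return float(probs[:, focused_cols].sum(axis=1).mean())
```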
“…Computer vision-based approaches are used to evaluate both online and offline lecture videos to extract the emotions of the students, as emotions are crucial to the learning process. Using a pre-trained Convolutional Neural Network, the six basic emotions (angry, disgusted, fearful, happy, sad, and surprised) plus neutral are extracted [9]. Natural user interaction in e-learning remains an ongoing research task.…”
Section: Related Work
confidence: 99%