2017
DOI: 10.1016/j.image.2017.08.012
BNU-LSVED 2.0: Spontaneous multimodal student affect database with multi-dimensional labels

Cited by 31 publications (16 citation statements)
References 38 publications
“…In developing the FER model for the learning environment, it is essential to identify meaningful academic affective states. Academic affective states are part of specific affective states that cannot be represented by basic affective states (Wei, Q., et al 2017). Some related works in the literature have reported the relevant students' facial expressions, which help in determining their academic affective states.…”
Section: Proposed Methodology
confidence: 99%
“…This paper addresses this research gap by exploring an automatic real‐time system to monitor student group engagement by analysing their facial expressions and recognizing academic affective states. Academic affective states are part of specific affective states that cannot be represented by basic emotions (Wei, Q., et al 2017). For this study six meaningful academic affective states are used, namely: ‘boredom,’ ‘confuse,’ ‘focus,’ ‘frustrated,’ ‘yawning,’ and ‘sleepy,’ pertinent in the learning environment (D'Mello, S. 2013; Tonguç & Ozkara, 2020).…”
Section: Introduction
confidence: 99%
“…S1∼S12 are the 12 items of the scale. After normalization, the value of each PAD dimension lies in the range [−1, +1], with +P, +A, +D and −P, −A, −D expressing the various combinations of pleasure, excitement, and dominance. In this study, 112 groups of image sequences with clear and obvious emotional communication were selected from the BNU_LSVED dataset [24], and six researchers were invited to score them according to the above emotional scale. After scoring, the score of each image sequence is normalized according to formula (1) to obtain its PAD value; the PAD values are then grouped by emotion category, summarized, and averaged.…”
Section: Data Quantification For Emotion Detection
confidence: 99%
“…The lack of an accurate assessment of the participants' emotional experiences can jeopardise experiments such as this [22]. To better identify the academic emotions, both the participants and external coders are invited to undertake the task in this experiment.…”
Section: Creation Of the Database
confidence: 99%