2013
DOI: 10.3758/s13428-013-0421-3

Psychometric challenges and proposed solutions when scoring facial emotion expression codes

Abstract: Coding of facial emotion expressions is increasingly performed by automated emotion expression scoring software; however, there is limited discussion on how best to score the resulting codes. We present a discussion of facial emotion expression theories and a review of contemporary emotion expression coding methodology. We highlight methodological challenges pertinent to scoring software-coded facial emotion expression codes and present important psychometric research questions centered on comparing competing …

Cited by 31 publications (25 citation statements: 1 supporting, 24 mentioning, 0 contrasting)
References 64 publications (61 reference statements)
“…Nevertheless, some expressions, especially happiness, and also disgust and surprise, were classified better than sadness, anger, and fear (see Table 2), which is in total agreement with results obtained with other automated computation algorithms (Lucey et al., 2010). Second, AUs generally discriminated between expressive categories, and this was in accordance with FACS proposals (Ekman et al., 2002; Olderbak et al., 2014). Some AUs characterized expressions more specifically or strongly than others (see Table 3), e.g., AU12 for happiness, AU25 for surprise, AUs 9 and 10 for disgust, AU1 for fear, and AU4 for anger and sadness (the combination of AU4 with other AUs allowed for a clear discrimination between these two expressions; see Table 2).…”
Section: Discussion (supporting)
confidence: 85%
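As an illustration only, the AU-emotion associations described in the statement above can be read as a small lookup rule. The following Python sketch is a toy rule-based classifier built from those associations; it is not the scoring method of any cited package, and the evidence threshold and the use of AU15 (lip corner depressor) as the sadness-disambiguating cue are hypothetical additions.

    # Toy rule-based mapping from AU evidence scores to an emotion label.
    # AU-emotion pairings follow the citation statement above; the threshold
    # and the AU15 sadness cue are hypothetical.

    # Hypothetical AU evidence scores for one frame, keyed by AU number.
    example_scores = {1: 0.1, 4: 2.3, 12: -0.8, 15: 1.9, 25: -0.5}

    def classify_from_aus(au_scores, threshold=1.0):
        """Return a coarse emotion label from AU evidence scores."""
        active = {au for au, s in au_scores.items() if s >= threshold}
        if 12 in active:
            return "happiness"  # AU12: lip corner puller
        if 9 in active or 10 in active:
            return "disgust"    # AU9/AU10: nose wrinkler, upper lip raiser
        if 25 in active:
            return "surprise"   # AU25: lips part
        if 1 in active and 4 not in active:
            return "fear"       # AU1: inner brow raiser
        if 4 in active:
            # AU4 marks both anger and sadness; per the quoted results, its
            # combination with other AUs is what discriminates the two.
            return "sadness" if 15 in active else "anger"
        return "neutral"

    print(classify_from_aus(example_scores))  # -> "sadness"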
“…Recently, FACET has been used in psychological and applied research (see Dente et al., 2017). The automated analysis provides two types of measures (see Gordon et al., 2011; Olderbak et al., 2014): (a) expression evidence scores for each category: joy, anger, surprise, fear, disgust, sadness, and contempt, in addition to neutral; and (b) AU evidence scores (for 20 AUs: 1, 2, 4, 5, 6, 7, 9, 10, 12, 14, 15, 17, 18, 20, 23, 24, 25, 26, 28, and 43), according to FACS (Ekman et al., 2002); see also Cohn et al. (2007) and Cohn and De la Torre (2015). AUs are anatomically related to the movement of specific facial muscles (e.g., AU12 involves the contraction of the zygomaticus major muscle, which draws the angle of the mouth superiorly and posteriorly to allow for smiling).…”
Section: Methods (mentioning)
confidence: 99%
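A minimal sketch of how the two measure types described in the statement above might be organized for one coded frame. The class and field names are hypothetical and assume nothing about FACET's actual API; only the expression-category list and the 20 AU numbers come from the statement.

    from dataclasses import dataclass
    from typing import Dict

    # Category list and AU numbers as given in the citation statement above.
    CATEGORIES = ["joy", "anger", "surprise", "fear",
                  "disgust", "sadness", "contempt", "neutral"]
    AU_NUMBERS = [1, 2, 4, 5, 6, 7, 9, 10, 12, 14, 15, 17, 18, 20,
                  23, 24, 25, 26, 28, 43]

    @dataclass
    class FrameCodes:
        """Evidence scores for one coded video frame (hypothetical container)."""
        expressions: Dict[str, float]   # (a) one evidence score per category
        action_units: Dict[int, float]  # (b) one evidence score per coded AU

        def validate(self) -> None:
            # Check that exactly the expected categories and AUs are present.
            assert set(self.expressions) == set(CATEGORIES)
            assert set(self.action_units) == set(AU_NUMBERS)

    # Usage: build and validate an (all-zero) frame of codes.
    frame = FrameCodes(
        expressions={c: 0.0 for c in CATEGORIES},
        action_units={au: 0.0 for au in AU_NUMBERS},
    )
    frame.validate()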
“…This technology has potential applications in research involving PTSD among both children and adults. Previous studies using emotional facial expression software were limited to adults (Samal and Iyengar, 1992; Zeng et al., 2009; Olderbak et al., 2014). Prior studies have not used the software with child study participants.…”
Section: Discussion (mentioning)
confidence: 99%
“…J. Wilson, Smyth, & MacLean, 2014). The development of robust mobile eye trackers (e.g., Applied Science Laboratories’ Mobile Eye system), the emergence of commercial software for automated facial analytics (e.g., from Affectiva, Emotient, and Noldus; Olderbak, Hildebrandt, Pinkpank, Sommer, & Wilhelm, 2014), and the widespread dissemination of smart phone technology afford additional opportunities for objectively and unobtrusively quantifying social attention, context, and daily behavior (Gosling & Mason, 2015; Sano et al., 2015; Wrzus & Mehl, 2015). Combining these measures with laboratory assays of brain function would open the door to discovering the neural systems underlying maladaptive experiences and pathology-promoting behaviors (e.g., social withdrawal, avoidance, and hyper-vigilance) in the real world, close to clinical end-point (Price et al., 2016).…”
Section: Future Challenges (mentioning)
confidence: 99%