2021
DOI: 10.3390/s21124222
Assessing Automated Facial Action Unit Detection Systems for Analyzing Cross-Domain Facial Expression Databases

Abstract: In the field of affective computing, achieving accurate automatic detection of facial movements is an important issue, and great progress has already been made. However, a systematic evaluation of systems that now have access to dynamic facial databases remains an unmet need. This study compared the performance of three systems (FaceReader, OpenFace, AFARtoolbox) that detect each facial movement corresponding to an action unit (AU) derived from the Facial Action Coding System. All machines could detect the …

Cited by 28 publications (21 citation statements)
References 60 publications
“…OpenFace had the best AU detection performance among the systems evaluated in this study, consistent with the previous findings pertaining to dynamic facial expressions [19]. Among the pre-trained models, OpenFace had the best AU detection performance based on facial expression images.…”
Section: Discussion (supporting)
confidence: 90%
“…FaceReader provided inaccurate results at 45°. The poor performance of FaceReader for recognizing dynamic facial expressions reported in a previous study [19] may be due to its vulnerability to angle changes. In this study, the AUC values were significantly lower at 45°, especially for AUs 1 (inner brow raised), 2 (outer brow raised), 5 (upper lid raised), 25 (lips parted), and 26 (jaw dropped).…”
Section: Discussion (mentioning)
confidence: 78%
“…Moreover, like several other advanced androids (e.g., Glas et al., 2016; Ishi et al., 2017, 2019), Nikola has the ability to talk with prosody, which can facilitate multimodal emotional interactions (Paulmann and Pell, 2011). Androids can also utilize advanced artificial intelligence (for reviews, see Krumhuber et al., 2021; Namba et al., 2021) to sense and analyze human facial expressions. We expect that androids will be a valuable tool in future psychological research on human emotional interaction.…”
Section: Discussion (mentioning)
confidence: 99%