2020
DOI: 10.1371/journal.pone.0231968
A performance comparison of eight commercially available automatic classifiers for facial affect recognition

Abstract: In the wake of rapid advances in automatic affect analysis, commercial automatic classifiers for facial affect recognition have attracted considerable attention in recent years. While several options now exist to analyze dynamic video data, less is known about the relative performance of these classifiers, in particular when facial expressions are spontaneous rather than posed. In the present work, we tested eight out-of-the-box automatic classifiers, and compared their emotion recognition performance to that …


Cited by 141 publications (97 citation statements) | References 69 publications
“…For this purpose, we collected data from human observers and conducted automated facial expression analysis with a software tool called FACET (iMotions). FACET has been used widely, thereby demonstrating superior levels of emotion classification in recent cross-classifier comparisons (Stöckli et al, 2018 ; Dupré et al, 2020 ).…”
Section: Introduction
confidence: 99%
“…This toolkit is one of the most widely used cross-platform toolkits for recognizing multi-face expressions in real time using the facial action coding system (FACS). It has shown high accuracy and reliability in several applications [38][39][40]. The length of time for each recognized emotion was calculated and analyzed to determine whether the emotions presented in the robot's voice were reflected in the emotional states of users.…”
Section: Experiments B
confidence: 99%
“…These findings have direct consequences for the performance of automatic systems that analyze faces for detecting affective states of users. Studies evaluating commercially available software have also revealed challenges for predictions to correspond with self-reported affect [32], as well as the perceptions of third-party observers [25].…”
Section: Introduction
confidence: 99%