2020
DOI: 10.48550/arxiv.2007.10075
Preprint

Investigating Bias and Fairness in Facial Expression Recognition

Abstract: Recognition of expressions of emotions and affect from facial images is a well-studied research problem in the fields of affective computing and computer vision with a large number of datasets available containing facial images and corresponding expression labels. However, virtually none of these datasets have been acquired with consideration of fair distribution across the human population. Therefore, in this work, we undertake a systematic investigation of bias and fairness in facial expression recognition b…

Cited by 4 publications (2 citation statements)
References 45 publications (81 reference statements)
“…More specific to faces, Drozdowski et al summarizes that the cohorts of concern in biometrics are demographic (e.g., sex, age, and race), person-specific (e.g., pose or expression [46], and accessories like eye-wear or makeup), and environmental (e.g., camera-model, sensor size, illumination, occlusion) [5]. Albiero et al found empirical support that having training data that is well balanced in gender does not mean that results of a gender-balanced test set will be balanced [47].…”
Section: Imbalanced Data and Data Problems in FR (mentioning)
confidence: 99%
“…In this survey, human bias, as opposed to machine learning algorithm, was the topic of investigation. More directly related to our research field is Xu et al (2020). In their research, the authors investigate bias in facial expression recognition, albeit without the consideration of Action Units.…”
Section: Bias (mentioning)
confidence: 99%