2020
DOI: 10.1016/j.imavis.2020.103954

Investigating bias in deep face analysis: The KANFace dataset and empirical study

Cited by 33 publications (22 citation statements)
References 16 publications
“…• We also demonstrate that for age estimation, different facial parts have variable importance with "nose" being the least important region; • Our FP-Age achieves new state-of-the-art results on IMDB-Clean, Morph [5] and CACD [6]; • When trained on IMDB-Clean, our FP-Age also achieves state-of-the-art results on KANFace [7], FG-Net [8], Morph [5] and CACD [6] under cross-dataset evaluation.…”
mentioning
confidence: 67%
“…More recent studies [26], [39], [65], [12], [36], [9], [34], [57], [29], [6] focused on jointly investigating the effects of user demographics on face recognition. These studies showed that these effects lead to an exponential increase in face recognition error when biased race, gender, and age factors are combined [34].…”
Section: A. Estimating Bias In Face Recognition
mentioning
confidence: 99%
“…Georgopoulos et al. [11] tested different deep learning-based face analysis models, such as facial recognition and gender prediction, to determine whether those systems exhibit bias. Moreover, a ten-coding-scheme method was used in [12] to analyse the diversity of subjects' faces in their proposed KANFace dataset.…”
Section: Investigating Bias In Facial Data-Based Technology
mentioning
confidence: 99%
“…A method based on separating non-discriminative demographic features from the network embedding was proposed in [11] to mitigate bias resulting from unbalanced training data. However, it was effective only against age bias in face and gender recognition, and it failed to remove gender bias in the age estimation and face recognition tasks.…”
Section: Mitigating Bias By Removing Demographic Features
mentioning
confidence: 99%