2021 IEEE Symposium Series on Computational Intelligence (SSCI)
DOI: 10.1109/ssci50451.2021.9660182
Harnessing Unlabeled Data to Improve Generalization of Biometric Gender and Age Classifiers

Cited by 6 publications (2 citation statements)
References 23 publications
“…This draws attention to fairness and bias in AI-based facial analytics, where unintended consequences from biased systems call for a thorough examination of the datasets and models [18,5,17,8,4]. Most of the published research in this domain reports lower performance for women and dark-skinned people in facial attribute-based classification systems such as gender and age classification [8,17,30,24], and in face recognition [5,4]. As biased datasets produce biased models, many efforts have focused on developing gender- and race-balanced datasets for various facial-analysis applications.…”
Section: CelebDF Distribution
Confidence: 99%
“…"São Paulo subway ordered to suspend use of facial recognition" [19] is one of several media articles published by well-known press outlets suggesting that facial recognition technology is biased across demographics. Beyond media coverage, several published academic studies on bias in face recognition and visual attribute classification algorithms (such as gender classification, age classification, and BMI prediction) also report performance differences across gender, race, and age [4]–[6], [9], [14], [16], [17], [21], [22], [28]. An unfair (biased) algorithm is one whose decisions are skewed towards a particular group of people.…”
Section: Introduction
Confidence: 99%