Proceedings of the 2021 AAAI/ACM Conference on AI, Ethics, and Society
DOI: 10.1145/3461702.3462609
Age Bias in Emotion Detection: An Analysis of Facial Emotion Recognition Performance on Young, Middle-Aged, and Older Adults

Cited by 51 publications (20 citation statements); References 12 publications.
“…The results showed that age estimation generally performed poorly on older age groups (60+), an effect compounded by gender and race; age estimation performed worst for older women of colour. Recently, another study showed that, when evaluating facial emotion recognition (FER) systems using various classification performance metrics, state-of-the-art commercial systems performed best when recognizing emotions in younger adults (aged 19-31) and worst for the oldest age group (61-80) (Kim et al 2021).…”
Section: Age Bias in Algorithms and Digital Datasets (Technical Level)
Citation type: mentioning (confidence: 99%)
“…Wilson et al [71] find that state-of-the-art object detection systems also fail for people with darker skin. Rhue [53] observes that emotion detection systems are more likely to ascribe negative emotions to Black individuals, while Kim et al [31] find that emotion detection systems fail to generalize to images of older adults. In accordance with this finding, Park et al [45] show that computer vision datasets systematically underrepresent older adults.…”
Section: Impact of Training Data
Citation type: mentioning (confidence: 99%)
“…Bias in AI. Social biases related to gender [7], age [40], religion [1], and sexuality [68] have been observed in AI systems. Our review of the extensive related work in this area focuses on racial bias in AI and on biases observed in CLIP.…”
Section: Related Work
Citation type: mentioning (confidence: 99%)