2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW) 2020
DOI: 10.1109/cvprw50498.2020.00022
Bias in Multimodal AI: Testbed for Fair Automatic Recruitment

Cited by 38 publications (36 citation statements) | References 17 publications
“…However, our experiment also reveals many unconventional results that future work has to address. Our findings strongly motivate further advances in making recognition systems more robust against covariates [44], [28], explainable [5], [50], and fair [60], [51]. We hope that these findings help to develop robust and bias-mitigating face recognition solutions and also help to move forward bias-aware and bias-mitigating technology in other AI application areas.…”

Section: Introduction (supporting)
confidence: 52%
“…However, nowadays, the use of algorithms during the recruitment process is increasing. For example, there is a growing trend to use artificial intelligence algorithms to scan resumes [30] and video interviews [31]. Moreover, personnel selection processes also involve the use of algorithms during the earliest stages of the selection process, that is, even before the short-listing phase.…”

Section: Gender Biases in Humans Through Job Platforms (mentioning)
confidence: 99%
“…In sum, most of the field experiments based on the correspondence testing procedure have focused on the influence of human gender biases during the hiring process. However, the use of algorithms and job platforms in human resources departments is increasing (e.g., [30]). Typically, algorithms and job platforms are broadly used during the initial personalized automated job-alerts phase (e.g.…”

Section: Gender Biases Using Algorithms for Recruitment (mentioning)
confidence: 99%
“…There are also many instances where blind use of algorithmic models and machine learning methods has failed us. Google Flu Trends (Butler, 2013), IBM's Watson supercomputer cancer treatment recommendations (Ross and Swetlitz, 2018), and the Apple Card algorithm (Vigdor, 2019; Pena et al., 2020) are just some well-publicized examples. We do not think that these problems argue against either type of model; rather, their co-existence and further promotion of their better use make statistics a better playground.…”

Section: Data Models and Algorithmic Models Are No Longer Easily Distinguishable (mentioning)
confidence: 99%