2020
DOI: 10.1007/978-3-030-58580-8_43
REVISE: A Tool for Measuring and Mitigating Bias in Visual Datasets

Cited by 69 publications (67 citation statements) | References 44 publications
“…For example, if all the images labelled as 'CEO' are of white men, the model will not associate women or men of colour with that particular term. Therefore, diversity in training datasets is imperative to countering and mitigating implicit biases in machine learning models [3,13]. Building on this work, we approach the issue of bias in facial recognition datasets from the perspective of geography and devise measures that evaluate and accordingly increase diversity.…”
Section: Related Work
“…As a result, they have a high representation of attributes associated with Western societies, such as faces with lighter skin tones and Western clothing, leaving the datasets heavily biased, with an underrepresentation of non-Western regions such as Africa or West Asia. Wang et al. [13] studied the geographical distribution of images in OpenImages and ImageNet and found them to be Europe- and North America-centric, with the USA highly over-represented and Africa severely under-represented. When these datasets are used to train deep learning models, such biases can be propagated within the learning models and amplified within AI systems [3].…”
Section: Biases in Visual Datasets
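The geographical-skew audit described in the statement above amounts to counting images per source region and comparing each region's share against a uniform baseline. A minimal sketch of that bookkeeping is below; the region names and counts are purely illustrative assumptions, not figures from the REVISE paper or from OpenImages/ImageNet.

```python
from collections import Counter

# Hypothetical per-image source regions. All values here are invented
# for illustration and do not reflect any real dataset's distribution.
images = (
    ["North America"] * 500
    + ["Europe"] * 350
    + ["East Asia"] * 100
    + ["Africa"] * 30
    + ["West Asia"] * 20
)

counts = Counter(images)
total = sum(counts.values())

# Compare each region's share to a uniform baseline (1 / number of
# regions); a ratio well below 1.0 flags under-representation.
baseline = 1 / len(counts)
for region, n in counts.most_common():
    share = n / total
    print(f"{region:15s} share={share:.2f} ratio_to_uniform={share / baseline:.2f}")
```

In this toy distribution, Africa and West Asia land at ratios of 0.15 and 0.10 against the uniform baseline, mirroring the kind of severe under-representation the excerpt attributes to Wang et al.'s analysis.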