2021
DOI: 10.1111/ced.14726
Unintentional consequences of artificial intelligence in dermatology for patients with skin of colour

Cited by 12 publications (5 citation statements)
References 3 publications
“…If the models we use are racially biased, this could exacerbate the current racial divide in the US. 14 Similar research shows poor accuracy or racial bias for AI in population health, 3,6,15,16 dermatology, 15 heart failure, 16 opioid use, 17 kidney function, 18 speech recognition, 19 gender classification, 20 and many others. 21 There are very few validations of AI in general.…”
Section: Implementation of AI Results Can Be Racially Biased
confidence: 93%
“…Diagnostic AI, such as with skin recognition, has also been shown to be biased. 15 It is easy to imagine how an algorithm focused on clinical decision support for a dermatological disease might worsen health disparity. For example, if the prediction algorithm is trained with a predominantly Caucasian population in diagnosing skin cancer, it could lead to poor accuracy in Black or Brown populations.…”
Section: Implementation of AI Results Can Be Racially Biased
confidence: 99%
“…10 Although AI-based methods work well in assessing blood-soaked tissues, they are likely to underperform in assessing photographs of anatomical sites, as AI algorithms are known to be biased toward lighter skin tones. 11 There are several limitations to our observations. First, our results are specific to the bleeding scale used and might be different with other bleeding definitions.…”
Section: Discussionmentioning
confidence: 89%
“…The (re)production of racism, as well as racist thinking errors through the use of AI technology, is discussed in many places (Butt et al 2021; Langmia 2021; Yen and Hung 2021), but the arguments do not seem to want to be heard. The lack of addressing dominance and difference relations at the level of AI is not without repercussions, as it enables these relations to persist unquestioned and solidify within a novel normative framework.…”
Section: Anomic Results of Democratically Oriented Norms
confidence: 99%