2020
DOI: 10.1093/jamia/ocaa210

Bias at warp speed: how AI may contribute to the disparities gap in the time of COVID-19

Abstract: The COVID-19 pandemic is presenting a disproportionate impact on minorities in terms of infection rate, hospitalizations, and mortality. Many believe Artificial Intelligence (AI) is a solution to guide clinical decision making for this novel disease, resulting in the rapid dissemination of under-developed and potentially biased models, which may exacerbate the disparities gap. We believe there is an urgent need to enforce the systematic use of reporting standards and develop regulatory frameworks for a shared C…

Cited by 62 publications (55 citation statements)
References 6 publications
“…Early models predicting COVID-19 outcomes were mainly developed from non-Hispanic White and Asian patients, reflecting the incidence of disease at the onset of the pandemic. [13][14][15] However, newer data highlight the overrepresentation of Hispanic people among patients who receive a positive test result for SARS-CoV-2. Our work underscores this disproportionate incidence of disease and highlights the trend in test positivity among Hispanic patients.…”
Section: Discussion (mentioning, confidence: 99%)
“…During the COVID-19 pandemic, demand for rapid-response technological interventions might hinder responsible AI design and use. 37 38 In a living systematic review of over 100 COVID-19 prediction models for diagnosis and prognosis, Wynants et al found that, owing to the pressure of rushed research, the proposed systems were at high risk of statistical bias, poorly reported, and overoptimistic. To date, the authors have recommended that none of the models be used in medical practice.…”
Section: Equity Under Pressure (mentioning, confidence: 99%)
“…One study found that a major risk-prediction algorithm assigned lower risk scores to Black patients than to White patients with comparable health conditions, attributable to mistakes in the algorithm’s initial design [41]. There are many more examples of the impact of disparities in research [39, 42, 43] and in AI [44, 45], and it is crucial not to repeat this oversight when developing a virtual physician.…”
Section: Discussion (mentioning, confidence: 99%)