The automation of bias in medical Artificial Intelligence (AI): Decoding the past to create a better future
2020. DOI: 10.1016/j.artmed.2020.101965

Cited by 52 publications (37 citation statements)
References 6 publications
“…Deep learning (DL) has been extensively documented to propagate health care disparities and biases, mostly through the use of biased training data, limiting its generalizability [11]. Conversely, it is possible to use DL algorithms to detect such disparities.…”
Section: Introduction (mentioning)
confidence: 99%
“…The author proposes, going forward, to decode the present and reshape existing practices before implementing AI, to avoid existing biases and further increasing health disparities. 59 Colling et al propose a UK-wide strategy for AI and DP. If the requirements of proper slide image management software, integrated reporting systems, improved scanning speeds, and high-quality images for DP systems are achieved, then it will provide time and cost saving benefits over the traditional microscope-based pathology approach and reduce the problem of inter-observer variation.…”
Section: AI - Issues To Be Resolved (mentioning)
confidence: 99%
“…In 2021, the UK parliamentary report on the gender health gap highlighted that the UK has the largest female health gap in the G20 and the 12th largest globally 5. The exclusion of females from research trials (extending to animal research), the neglect of female bodies throughout medical pedagogy and the unconscious biases of practitioners are a few of the intersecting factors that result in worse health outcomes for female patients 6–10…”
Section: Introduction (mentioning)
confidence: 99%
“…These ‘biochemical markers’ include proteins made by the liver (eg, albumin), and enzymes required for metabolism (eg, aspartate aminotransferase (AST)). Bias research has illustrated that biochemical markers are not equally effective for all patient groups 3 7 10–12. Suthahar et al describe how sex differences in biomarker thresholds affect objectivity in management, as what is considered ‘normal’ in one sex, may not be so in the other 12.…”
Section: Introduction (mentioning)
confidence: 99%