2023
DOI: 10.1111/his.15071

Why do errors arise in artificial intelligence diagnostic tools in histopathology and how can we minimize them?

Harriet Evans,
David Snead

Abstract: Artificial intelligence (AI)‐based diagnostic tools can offer numerous benefits to the field of histopathology, including improved diagnostic accuracy, efficiency and productivity. As a result, such tools are likely to have an increasing role in routine practice. However, all AI tools are prone to errors, and these AI‐associated errors have been identified as a major risk in the introduction of AI into healthcare. The errors made by AI tools are different, in terms of both cause and nature, to the errors made …

Cited by 14 publications (6 citation statements) | References 63 publications

Citation statements:
“…Identifying order sets and the algorithmic rules within AI has proven to be notably challenging [57, 126]. Inexperienced users may develop workarounds that compromise data, jeopardizing data integrity [25, 127]. Tissue sample size could also represent a potential issue (i.e., core biopsies), as well as the subjectivity of “hotspot” selection for training the algorithm [86, 127].…”
Section: Pitfalls and Prospectives
confidence: 99%
“…Inexperienced users may develop workarounds that compromise data, jeopardizing data integrity [25, 127]. Tissue sample size could also represent a potential issue (i.e., core biopsies), as well as the subjectivity of “hotspot” selection for training the algorithm [86, 127]. The presence of artifacts stemming from sampling, slide preparation, and slide digitalization can impede computational analysis and lead to erroneous data interpretation.
Section: Pitfalls and Prospectives
confidence: 99%
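As a loose illustration of the point about preparation and digitisation artifacts, the sketch below shows one way a laboratory might screen whole-slide-image tiles for obvious quality problems (out-of-focus regions, near-empty background) before they reach an AI tool. It is a minimal sketch, assuming OpenCV and NumPy are available; the function name tile_quality_flags and both threshold values are illustrative assumptions, not taken from the cited work.

```python
import cv2
import numpy as np

def tile_quality_flags(tile_bgr, blur_threshold=100.0, tissue_threshold=0.05):
    """Flag a WSI tile for possible artifacts before computational analysis.

    Thresholds are illustrative and would need tuning for a specific
    scanner and stain combination.
    """
    gray = cv2.cvtColor(tile_bgr, cv2.COLOR_BGR2GRAY)

    # Variance of the Laplacian: low values suggest an out-of-focus tile.
    focus_measure = cv2.Laplacian(gray, cv2.CV_64F).var()

    # Rough tissue fraction: pixels darker than near-white background.
    tissue_fraction = float(np.mean(gray < 220))

    return {
        "blurred": focus_measure < blur_threshold,
        "mostly_background": tissue_fraction < tissue_threshold,
        "focus_measure": focus_measure,
        "tissue_fraction": tissue_fraction,
    }
```

A flagged tile would typically be excluded from analysis or routed for rescanning rather than silently passed to the diagnostic model.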
“…Inadequate understanding of how artificial intelligence models work is a key contributing factor to automation bias [106]. An AI-recommended diagnosis may lead a pathologist to change a correct diagnosis to the incorrect one suggested by the AI-based tool [107]. Pathologists need to be informed that AI tools make predictions based on extensive mathematical computation, and that these predictions are prone to error.
Section: Challenges and Limitations
confidence: 99%
“…Pathologists need to be informed that AI tools make predictions based on extensive mathematical computation, and that these predictions are prone to error. Adding a confidence score to predictions may help a pathologist focus on cases where an AI tool is "unsure" of its recommendations [107]. On the other hand, an AI tool may also associate a high confidence score with a wrong diagnosis, so it would be advisable for a pathologist to consult other experts when faced with difficult cases.
Section: Challenges and Limitations
confidence: 99%
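To illustrate how a confidence score might be used in practice, the sketch below routes predictions whose top softmax probability falls below a threshold to pathologist review. The function name triage_predictions, the 0.9 threshold and the example label set are assumptions for illustration only; and, as the statement above notes, a high score does not guarantee a correct diagnosis, so the flag only prioritises review rather than finalising any case.

```python
import numpy as np

def triage_predictions(probabilities, labels, review_threshold=0.9):
    """Route AI predictions below a confidence threshold to pathologist review.

    probabilities: array of shape (n_cases, n_classes) of softmax outputs.
    The 0.9 threshold is illustrative; in practice it would be calibrated
    and validated clinically before use.
    """
    probabilities = np.asarray(probabilities)
    top_class = probabilities.argmax(axis=1)
    top_confidence = probabilities.max(axis=1)

    results = []
    for cls, conf in zip(top_class, top_confidence):
        results.append({
            "predicted_label": labels[cls],
            "confidence": float(conf),
            # High confidence is not a guarantee of correctness, so this
            # flag only prioritises review; it never auto-finalises a case.
            "needs_pathologist_review": conf < review_threshold,
        })
    return results

# Example: two cases; the second falls below the review threshold.
probs = [[0.02, 0.97, 0.01], [0.40, 0.35, 0.25]]
print(triage_predictions(probs, ["benign", "malignant", "atypical"]))
```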
“…Strengthening the connection between pathological knowledge generation and the computational field is crucial. In the literature, the lack of communication between pathologists and other scientists can be seen in various examples, including relevant diagnostic classes being missed in the computational training setup (known as hidden stratification), assembly of large case series that fail to represent emerging entity subtypes reinforcing outdated practices, and computational methods not being tested for relevant mimickers and differential diagnoses, to mention a few [ 18 , 19 ]. Pathologists should commit themselves to the AI‐driven transition and regain their central role as tissue experts and evaluators of digital and computational methods.…”
confidence: 99%