2024
DOI: 10.1007/s00508-024-02329-1

Delayed diagnosis of a transient ischemic attack caused by ChatGPT

Jonathan A. Saenger,
Jonathan Hunger,
Andreas Boss
et al.

Abstract: Techniques of artificial intelligence (AI) are increasingly used in the treatment of patients, such as providing a diagnosis in radiological imaging, improving workflow by triaging patients, or providing an expert opinion based on clinical symptoms; however, such AI techniques also hold intrinsic risks, as AI algorithms may point in the wrong direction and constitute a black box that does not explain its decision-making process. This article outlines a case where an erroneous ChatGPT diagnosis…

Cited by 11 publications (4 citation statements) | References 7 publications
“…present a case report of a patient who relied on an inaccurate ChatGPT diagnosis, which resulted in a substantial delay in treatment and a potentially life-threatening situation [49]. These findings raise important clinical and ethical considerations, especially in the sensitive setting of a PICU, where decisions can have significant consequences.…”
Section: Discussion (mentioning)
confidence: 90%
“…The ability of ChatGPT to provide appropriate and equitable medical advice was evaluated by Nastasi et al., who presented ChatGPT with various clinical vignettes and found that, while ChatGPT’s responses were largely in line with clinical guidelines, it did not consistently offer personalized medical advice [48]. AI algorithms carry inherent dangers that may lead to incorrect conclusions and operate as “black box” systems that do not elucidate the rationale behind decision-making [49]. Saenger et al. present a case report of a patient who relied on an inaccurate ChatGPT diagnosis, which resulted in a substantial delay in treatment and a potentially life-threatening situation [49].…”
Section: Discussion (mentioning)
confidence: 99%
“…Paramount concerns include the propensity of LLMs to disseminate inadequate information; the input of sensitive health information or patient data, which raises significant privacy issues [24]; and the perpetuation of harmful gender, cultural or racial biases [27–30], well known from machine learning algorithms [31], especially in healthcare [32]. Case reports have documented that ChatGPT has already caused actual damage, potentially life-threatening for patients [33].…”
Section: Introduction (mentioning)
confidence: 99%