2022
DOI: 10.3389/fmed.2022.1016366
“Nothing works without the doctor:” Physicians’ perception of clinical decision-making and artificial intelligence

Abstract: Introduction: Artificial intelligence–driven decision support systems (AI-DSS) have the potential to help physicians analyze data and facilitate the search for a correct diagnosis or suitable intervention. The potential of such systems is often emphasized. However, implementation in clinical practice deserves continuous attention. This article aims to shed light on the needs and challenges arising from the use of AI-DSS from physicians’ perspectives. Methods: The basis for this study is a qualitative content analys…

Cited by 11 publications (12 citation statements)
References 41 publications
“… 42 For these systems to be adopted by physicians, they must be more explainable and adaptable to shared decision-making. 42 …”
Section: Results
confidence: 99%
“…Decision support systems powered by artificial intelligence are designed to help physicians analyze data and improve diagnosis and treatment. 42 For these systems to be adopted by physicians, they must be more explainable and adaptable to shared decision-making. 42 A schematic model for implementing the co-pilot model in diagnosis and treatment is shown in Figure 1B and C.…”
Section: A New Structure for the Department of Medicine
confidence: 99%
“…This is the first qualitative study to combine a multi-stakeholder approach and an anticipatory ethics approach to analyse the ethical, legal and social issues surrounding the implementation of AI-based tools in healthcare by focusing on the micro-level of healthcare decision-making and not only on groups such as healthcare professionals/workers, 9, 10, 12, 13, 16, 17 healthcare leaders 15 or patients, family members and healthcare professionals. 29 Interestingly, the findings have highlighted the impact of AI on: (1) the fundamental aspects of the patient–physician relationship and its underlying core values, as well as the need for a synergistic dynamic between the physician and AI; (2) alleviating workload and reducing the administrative burden by saving time and bringing the patient to the centre of the caring process; and (3) the potential loss of a holistic approach by neglecting humanness in healthcare.…”
Section: Discussion
confidence: 99%
“…Interestingly, healthcare professionals have demonstrated openness and readiness to adopt generative AI, mostly because they are excessively burdened by administrative tasks 8 and are desperately seeking a practical solution. Several medical specialisations have been identified as benefiting from the use of medical AI, including general practitioners, 9 nephrologists, 10 nuclear medicine 11 and pathologists, 12 with the technology reportedly having a direct impact on physicians’ roles, responsibilities, and competencies. 12 14 Although the above-mentioned potential has been recognised, various studies have noted that the implementation of medical AI would bring about certain challenges 15 and barriers, 16 such as physicians’ trust in the AI, user-friendliness, 17 or tensions between the human-centric model and the technology-centric model, that is, upskilling and deskilling, 18 which will further impact the (non-)acceptance of AI-based tools.…”
Section: Introduction
confidence: 99%