Objective: The accuracy of artificial intelligence (AI) in medicine, and in pathology in particular, has improved markedly, but little is known about how much these algorithms will influence pathologists' decisions in practice. The objective of this paper is to determine the reliance of pathologists on AI and to investigate whether providing information about the AI impacts this reliance.

Materials and Methods: The experiment used an online survey design. Under 3 conditions, 116 pathologists and pathology students were tasked with assessing the Gleason grade for a series of 12 prostate biopsies: (1) without AI recommendations, (2) with AI recommendations, and (3) with AI recommendations accompanied by information about the algorithm itself, specifically the algorithm's accuracy rate and its decision-making process.

Results: Participant responses were significantly more accurate with the AI decision aids than without (92% vs 87%, odds ratio 13.30, P < .01). Unexpectedly, providing information about the algorithm made no significant difference compared with AI recommendations alone. Reliance on AI correlated with general beliefs about AI's usefulness but not with participants' assessments of the specific AI tool offered. Decisions were also made faster when AI recommendations were provided.

Discussion: These results suggest that pathologists are willing to rely on AI regardless of its accuracy or the explanations provided. Generalization beyond the specific tasks and explanations studied here will require further work.

Conclusion: This study suggests that the factors that influence reliance on AI in practice differ from the beliefs clinicians express in surveys. Implementation of AI in prospective settings should take individual behaviors into account.
Background: Artificial intelligence (AI) is rapidly gaining attention in medicine, and in pathology in particular. While much progress has been made in refining the accuracy of algorithms, thereby increasing their potential use, we need to better understand how these algorithms will be used by pathologists, who will remain the decision-makers for the foreseeable future. The objective of this paper is to determine the propensity of pathologists to rely on AI decision aids and to investigate whether providing information on the algorithm impacts this reliance.

Methods: To test our hypotheses, we conducted an experiment with a within-subjects design using an online survey. 116 pathologists and pathology students participated. Each participant was tasked with assessing the Gleason grade for a series of 12 prostate cancer samples under three conditions: without advice, with advice from an AI decision aid, and with advice from an AI decision aid accompanied by information on the algorithm, namely its accuracy rate and its model. Scores were computed by comparing respondents' answers with the "true" score at the individual-question level. A mixed-effects logistic regression was used to analyze the difference in scores between conditions, controlling for the random effects of participants and images, and to assess interactions with experience, gender, and beliefs about AI.

Results: Participant responses with AI decision aids were significantly more accurate than in the control condition without aid. However, no significant difference was found when subjects were additionally provided with accuracy rate and model information about the AI advice. Moreover, the propensity to rely on AI was related to general beliefs about AI but not to assessments of the particular AI tool offered. Males also performed better in the no-aid condition but not in the AI-aid condition.

Conclusions: AI can significantly influence pathologists, and general beliefs about AI could be major predictors of pathologists' future reliance on AI.
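As a rough illustration of how an odds ratio contrasts accuracy between two conditions, the sketch below computes a simple (marginal) odds ratio from hypothetical 2x2 counts; the trial totals here are illustrative assumptions, not data from the study. Note that the study itself fitted a mixed-effects logistic regression with random effects for participants and images, and the conditional odds ratio from such a model can be much larger than this marginal estimate.

```python
import math

def odds_ratio(correct_a, total_a, correct_b, total_b):
    """Marginal odds ratio comparing accuracy in condition A vs condition B."""
    odds_a = correct_a / (total_a - correct_a)  # odds of a correct answer in A
    odds_b = correct_b / (total_b - correct_b)  # odds of a correct answer in B
    return odds_a / odds_b

# Hypothetical counts chosen to approximate 92% vs 87% accuracy
# over 696 trials per condition (illustrative only).
or_ai_vs_none = odds_ratio(640, 696, 605, 696)
print(round(or_ai_vs_none, 2))  # ~1.72
```

This marginal estimate (~1.72) is far below the reported 13.30, which is expected: in a logistic mixed model, the fitted odds ratio is conditional on the participant- and image-level random effects, and strong clustering inflates it relative to the population-averaged figure.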