Objective: To better understand public perception and comprehension of medical technology such as artificial intelligence (AI) and robotic surgery, and to identify sensitivity to, and comfort with, the use of AI and robotics in medicine in order to ensure acceptability and quality of counseling and to guide future development. Subjects and Methods: A survey was conducted on a convenience sample of visitors to the Minnesota State Fair (n = 264). The survey investigated participants' beliefs about the capabilities of AI and robotics in medicine and their comfort with such technology. Participants were randomized to receive one of two similar surveys; in the first, a diagnosis was made by a physician and in the second by an AI application, to compare confidence in human and computer-based diagnosis. Results: The median age of participants was 45 (interquartile range 28-59); 58% were female (n = 154) vs 42% male (n = 110); 69% had completed at least a bachelor's degree; 88% were Caucasian (n = 233) vs 12% ethnic minorities (n = 31); and participants came from 12 US states, mostly in the Upper Midwest. Participants had nearly equal trust in AI and physician diagnoses. However, they were significantly more likely to trust an AI diagnosis of cancer over a doctor's diagnosis when responding to the version of the survey that suggested an AI could make medical diagnoses (p = 9.32e-06). Although 55% of respondents (n = 145) reported that they were uncomfortable with automated robotic surgery, most of those surveyed (88%) mistakenly believed that partially autonomous surgery was already being performed. Almost all (94%, n = 249) stated that they would be willing to pay for a review of their medical imaging by an AI, if available. Conclusion: Most participants expressed confidence in AI providing medical diagnoses, sometimes even over human physicians. Participants generally expressed concern about surgical AI, yet mistakenly believed that it is already being performed. As AI applications increase in medical practice, health care providers should be cognizant of patient misconceptions and of the sensitivity patients have to how such technology is represented.
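The abstract reports a significant between-arm difference in trust of an AI cancer diagnosis (p = 9.32e-06) but does not state which statistical test was used. As an illustrative assumption only, a chi-square test of independence on a 2x2 contingency table is one common way to compare categorical trust responses between two randomized survey versions; the counts below are hypothetical and are not taken from the study.

```python
# Hypothetical sketch: the abstract does not report the statistical test used.
# A chi-square test of independence is one common choice for comparing
# categorical "trust" responses across the two randomized survey arms.
from scipy.stats import chi2_contingency

# Rows: survey arm (physician-wording version, AI-wording version)
# Columns: response (would trust AI cancer diagnosis, would not)
# Counts are invented for illustration and do not come from the study.
table = [
    [40, 92],   # physician-wording arm (hypothetical)
    [85, 47],   # AI-wording arm (hypothetical)
]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.2e}, dof = {dof}")
```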