Proceedings of the 20th ACM International Conference on Intelligent Virtual Agents 2020
DOI: 10.1145/3383652.3423864

Empathic Chatbot Response for Medical Assistance

Abstract: Is it helpful for a medical physical health chatbot to show empathy? How can a chatbot show empathy based only on short text conversations? We investigated these questions by building two different medical assistant chatbots, each with the goal of providing the user a diagnosis for a physical health problem based on a short conversation. One chatbot was advice-only and asked only the questions necessary for the diagnosis, without responding to the user's emotions. The other chatbot, capable of showing empathy…

Cited by 16 publications (3 citation statements) | References 10 publications

“…Nonetheless, empathy display and relational behavior are significant research themes in dialog systems development and robotics (Kennedy et al., 2012; Liu and Sundar, 2018; Pepito et al., 2020; Kerruish, 2021). Studies with patients have shown that most people prefer medical assistant chatbots that mimic empathy (Amini et al., 2013; Liu and Sundar, 2018; Daher et al., 2020); this is particularly true for users who are initially skeptical about machines possessing social cognitive capabilities. However, research in Korea (Yun et al., 2021) shows a discrepancy between expressed behavioral intentions toward medical AI and implicit attitudes (detected in brain scans): people respond differently to the same conversation depending on whether it is delivered by a human doctor or a medical AI.…”
Section: Results
confidence: 99%
“…
- Robot/artificial emotional response behaviours (artificial empathy) (Kennedy et al., 2012; Pepito et al., 2020; Kerruish, 2021; Montemayor et al., 2021)
- Empathetic chatbots (Amini et al., 2013; Liu and Sundar, 2018; Daher et al., 2020)
- Empathetic medical conversations (Yun et al., 2021)
- Web app that provides cancer disease-related information to patients (Papadakos et al., 2017)
- AI-generated diagnosis information for radiology patients (Zhang et al., 2021)

Health coaching (11 articles):
- Virtual health coaches (Kennedy et al., 2012; Bevilacqua et al., 2020), including smoking cessation (He et al., 2022), weight loss (Stein and Brooks, 2017), self-management of depression (Inkster et al., 2018), and chronic disease self-management (Hernandez, 2019)
- Therapeutic chatbots for mental health (Lee et al., 2019; Valtolina and Hu, 2021)
- Automated healthcare quality assessment, e.g., sentiment analysis of patient feedback from diverse groups of service users (Doing-Harris et al., 2017; Rahim et al., 2021)
- Automated analysis of patient and family feedback captured by interactive patient care technology in hospitals (Clavelle et al., 2019)
- Automated analysis of online health communities to inform policy for patient self-care (Panzarasa et al., 2020)
- Automated evaluation of psychotherapy services linked to training, supervision, and quality assurance (Flemotomos et al., 2022; Xiao et al., 2015)
…”
Section: Empathetic Awareness (15 Articles)
confidence: 99%
“…Perceptions of chatbots as less serious than real health care professionals might result in skepticism and reduced reliance on chatbot-driven information. A previous study discerned a preference for advice-only chatbots over empathic ones for self-diagnosis [36]. The potential lack of empathy perceived in chatbots could contribute to a diminished level of trust, highlighting the ongoing debate regarding the effectiveness of chatbots in web-based diagnosis and interventions.…”
Section: Discussion
confidence: 99%