2018
DOI: 10.2196/10148

Towards an Artificially Empathic Conversational Agent for Mental Health Applications: System Design and User Perceptions

Abstract: Background: Conversational agents cannot yet express empathy in nuanced ways that account for the unique circumstances of the user. Agents that possess this faculty could be used to enhance digital mental health interventions. Objective: We sought to design a conversational agent that could express empathic support in ways that might approach, or even match, human capabilities. Another aim was to assess how users might appraise such a system. Methods: Our system used a corpus-based approach to simulate expressed empathy…


Cited by 188 publications (116 citation statements)
References 29 publications
“…Chatbots in health care may have the potential to provide patients with access to immediate medical information, recommend diagnoses at the first sign of illness, or connect patients with suitable health care providers (HCPs) across their community [13,14]. Theoretically, in some instances, chatbots may be better suited to meet patient needs than a human physician because they have no biological gender, age, or race and elicit no bias toward patient demographics.…”
Section: Introduction (citation type: mentioning; confidence: 99%)
“…Some systems also leverage unconstrained natural language input to index health advice but do not frame the interaction as a conversation. Kokobot is a conversational agent that facilitates interactions among users of an online peer-to-peer social support platform designed to promote emotional resilience [27]. Users are prompted to describe stressful situations and associated negative thoughts, and Kokobot responds to these submissions by retrieving and repurposing statements from a corpus of supportive statements previously submitted to Koko by other users.…”
Section: Introduction (citation type: mentioning; confidence: 99%)
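
The retrieve-and-repurpose mechanism described in the statement above can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the authors' system: the SupportCorpus class, the example texts, and the choice of TF-IDF cosine similarity as the matching function are all assumptions made for illustration.

# Hypothetical sketch of corpus-based supportive-response retrieval.
# Not Koko's actual implementation; names and similarity metric are assumed.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

class SupportCorpus:
    """Matches a new post to the most similar prior post and reuses its response."""

    def __init__(self, posts, responses):
        # posts: stressful situations previously described by users
        # responses: supportive statements other users wrote for those posts
        self.responses = responses
        self.vectorizer = TfidfVectorizer(stop_words="english")
        self.post_vectors = self.vectorizer.fit_transform(posts)

    def retrieve_response(self, new_post):
        # Embed the incoming post and return the response attached to
        # its nearest neighbor in the corpus.
        query = self.vectorizer.transform([new_post])
        scores = cosine_similarity(query, self.post_vectors)[0]
        return self.responses[scores.argmax()]

corpus = SupportCorpus(
    posts=[
        "I bombed my exam and feel like a total failure.",
        "My boss criticized me in front of everyone today.",
    ],
    responses=[
        "One exam doesn't define you; this feeling will pass.",
        "Being singled out like that is hard; you didn't deserve it.",
    ],
)
print(corpus.retrieve_response("I failed my test and feel worthless."))

A deployed system would presumably pair this retrieval step with ranking, safety filtering, and rephrasing of the reused statement; the citation statement does not specify those details.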
“…Their efficacy is limited because digital tools often fail to motivate and engage users [13,14]. They also currently lack the human-level intelligence required to address nuanced problems [15,16]. It appears that until realistic artificial intelligence is available, many people require human-delivered interventions to meet their preferences, engage them, and respond to their unique concerns.…”
Section: Introduction (citation type: mentioning; confidence: 99%)