2021
DOI: 10.2196/24343
Emotional Reactions and Likelihood of Response to Questions Designed for a Mental Health Chatbot Among Adolescents: Experimental Study

Abstract: Background Psychological distress increases across adolescence and has been associated with several important health outcomes with consequences that can extend into adulthood. One type of technological innovation that may serve as a unique intervention for youth experiencing psychological distress is the conversational agent, otherwise known as a chatbot. Further research is needed on the factors that may make mental health chatbots destined for adolescents more appealing and increase the likelihoo…

Cited by 21 publications (17 citation statements)
References 42 publications
“…Somewhat removed from typical workflows, we can find CDS tools, such as chatbots or technological conversation agents. 32 , 33 These are automated systems that patients or clinicians may engage with through a “conversation.” Through these conversations, the tool can gather data and be utilized as a screening system or provide educational content to the user. 34 …”
Section: Types of CDS Tools
confidence: 99%
“…While many popular mental health chatbots exist, few studies have reported on how user groups can contribute to co-design as it is important to consider the user needs when designing content and features for this application. A few recent studies have involved young people in the design process to co-develop mental health and wellbeing chatbots targeted at under-18s (Audrey et al., 2021; Grové, 2021). Another study by Easton et al reported on co-designing content for a health chatbot by involving patients with lived experiences (Easton et al., 2019).…”
Section: Introduction
confidence: 99%
“…Confidentiality and data protection [33], [39], [82]; regulatory compliance [33], [39], [83], [82]; localization [51], [83]; trust and transparency [84]–[86]. Purposeful and goal-oriented designs: personalization and choices [32], [36], [76], [83]; ease of access [54], [58], [87]; participation, engagement, and collaborative design [36], [48], [85], [86]…”
Section: Ethical Design and Governance
confidence: 99%
“…The use of AI technology in public services is a controversial topic, which necessitates careful consideration of the issues of trust, fairness, and transparency, which should aim to gain citizens' confidence and trust towards the user interface, the technology platform, and the purpose [78], [84]–[86]. Additionally, ethical designs of mental health chatbots require them to be sensitive towards sub-cultural differences and localization aspects, for example, multi-lingual conversational ability [51], [83].…”
Section: A. Ethical Design and Governance
confidence: 99%