Published 2023 · DOI: 10.1001/jamapediatrics.2023.4215

AI as a Mental Health Therapist for Adolescents

Douglas J. Opel,
Brent M. Kious,
I. Glenn Cohen

Abstract: This Viewpoint discusses benefits and risks of using conversational artificial intelligence platforms to deliver psychotherapy to adolescents.

Cited by 5 publications (3 citation statements)
References 9 publications

“…5 When these brain-computer interfaces become bidirectional, they can provide somatosensory feedback, including perceptions of pressure or warmth, or even disable fear, and could also be used for the enhancement of able-bodied individuals (Jecker and Ko, 2022a, b). Recently, generative AI for the creation of visual and performance art, as well as for multiple medical fields (especially radiology, mental health, and drug development), is being explored (Rajpurkar and Lungren, 2023; Howell et al, 2024; Rengers et al, 2024; Opel et al, 2023). Here, the concepts of non-human agency or "assemblages" of human and non-human (Lupton, 2019) become particularly evident.…”
Section: Being As Co-being and Actions As Co-actions
Citation type: mentioning
Confidence: 99%
“…Using the example of conversational artificial intelligence (CAI) applications such as chatbots, it is important to ensure that established biases against marginalized populations are considered as CAI models are trained (Opel et al, 2023). A recent review of the presence of racial bias in clinical ML methods suggested that more consistent adoption of algorithmic fairness principles in medicine is required and that standards for data and ML model reporting availability require more attention (Huang et al, 2022).…”
Section: Using Artificial Intelligence To Help Mitigate Challenges In...
Citation type: mentioning
Confidence: 99%
“…The COVID‐19 pandemic exacerbated this public health crisis, with a skyrocketing incidence of EDs and unprecedented spikes in hospital admissions and emergency room visits noted in adolescents globally (Devoe et al, 2023; Spettigue et al, 2021). Higher rates of patient presentations have contributed to already overburdened and inequitable mental health (MH) systems, reaching a critical point in which alternative solutions are required to address the mismatch between patient needs and available services (Opel et al, 2023).…”
Section: Introduction
Citation type: mentioning
Confidence: 99%