2021
DOI: 10.2196/24045
Clinical Advice by Voice Assistants on Postpartum Depression: Cross-Sectional Investigation Using Apple Siri, Amazon Alexa, Google Assistant, and Microsoft Cortana

Abstract: Background A voice assistant (VA) is inanimate audio-interfaced software augmented with artificial intelligence, capable of 2-way dialogue, and increasingly used to access health care advice. Postpartum depression (PPD) is a common perinatal mood disorder with an annual estimated cost of $14.2 billion. Only a small percentage of PPD patients seek care due to lack of screening and insufficient knowledge of the disease, and this is, therefore, a prime candidate for a VA-based digital health interven…

Cited by 52 publications (37 citation statements)
References 22 publications
“…Similar to our previous paper (7), some limitations should be noted. The current study did not assess the usefulness or safety of the medical information given, as some other previous research has examined (2)(3)(4)(5)(6), although this was not the primary purpose of the study. Future research could investigate the implications of errors when comprehending medication names during interactions with different patient conditions, symptoms, contraindications, and side effects.…”
Section: Discussion (mentioning; confidence: 99%)
“…Intelligent virtual (or voice) assistants (IVA), such as Amazon Alexa (hereinafter referred to as Alexa), Google Assistant, and Apple Siri (hereinafter referred to as Siri), are popular artificial intelligence (AI) software programs designed to simulate human conversation and perform web-based searches and other commands (1). Previous research has also investigated the use of these devices to gather health information and give medically related suggestions for mental and physical health inquiries (2)(3)(4)(5)(6). However, these findings have revealed that IVAs generally provide poor, inconsistent, and potentially harmful advice to users.…”
Section: Introduction (mentioning; confidence: 99%)
“…Parental familiarity and use of voice assistants via smartphones and smart speakers are a promising indicator toward future utilization of voice interaction in care management. It will need to be built integrated with current healthcare technologies, moving the needle from voice interaction being primarily used for health seeking activities (21, 35) and health screening (20, 36) to the area of care management.…”
Section: Discussion (mentioning; confidence: 99%)
“…Voice interactive technologies (e.g., voice assistants) and ASR algorithms have been improving over the years. They enable users to command and interact with digital tools using speech and dialogue mechanisms and show promise for a variety of healthcare uses (20)(21)(22)(23). In our earlier work (18), we prototyped the SpeakHealth app and collected feedback from parents and healthcare providers which informed the design and features of the app.…”
Section: Introduction (mentioning; confidence: 99%)
“…Several studies have now demonstrated the potential safety risks when consumers and patients use CAs for medical information and act on it without further consultation with health care professionals. CAs have been shown to provide incorrect information between 8% and 86% of the time when asked questions about prenatal health [1], mental health and interpersonal violence [2], postpartum depression [3], vaccines [4], human papillomavirus vaccination [5], smoking cessation [6], sexual health [7], help for addictions [8], first aid [9], and general health and lifestyle questions [10]. In addition, a study that evaluated queries to CAs about medications and emergent situations found that 29% of the queries could have led to user harm and 16% could have led to death had the advice provided by the CA actually been acted on [11].…”
Section: Introduction (mentioning; confidence: 99%)