Background: Digital mental health interventions (DMHIs) have been increasingly deployed to bridge gaps in mental health care, particularly given their promising efficacy. Nevertheless, attrition among DMHI users remains high. In response, human support has been studied as a means of improving retention in and outcomes of DMHIs. Although a growing number of studies and meta-analyses have investigated the effects of human support for DMHIs on mental health outcomes, systematic empirical evidence of its effectiveness across mental health domains remains scant.

Objective: We aimed to summarize the results of meta-analyses comparing human support with no support for DMHI use across various outcome domains, participant samples, and support providers.

Methods: We conducted a systematic meta-review of meta-analyses comparing the effects of human support with those of no support for DMHI use, with the goal of qualitatively summarizing the data across outcome domains, participant samples, and support providers. We searched the MEDLINE, PubMed, and PsycINFO electronic databases. Articles were included if (1) the study used a quantitative meta-analytic design; (2) the intervention targeted mental health symptoms and was delivered via a technology platform (excluding person-delivered interventions mediated through telehealth, text messages, or social media); (3) the outcome variables included mental health symptoms such as anxiety, depression, stress, or posttraumatic stress disorder symptoms, or a combination of these; and (4) the study quantitatively compared outcomes when human support was provided with outcomes when no or minimal human support was provided.

Results: The results of 31 meta-analyses (505 unique primary studies) were analyzed. The meta-analyses reported 45 effect sizes; almost half (n=22, 49%) showed that human-supported DMHIs were significantly more effective than unsupported DMHIs, whereas 9% (4/45) showed that unsupported DMHIs were significantly more effective. No clear patterns emerged regarding the efficacy of human support for the outcomes assessed (including anxiety, depression, posttraumatic stress disorder, stress, and multiple outcomes combined). Human-supported DMHIs may be more effective than unsupported DMHIs for individuals with elevated mental health symptoms. No clear results emerged regarding the type of training received by those providing support.

Conclusions: Our findings highlight the potential of human support to improve the effects of DMHIs. Specifically, evidence emerged for stronger effects of human support among individuals with greater symptom severity. There was considerable heterogeneity across meta-analyses in the level of detail reported about the nature of the interventions, the populations served, and the support delivered, making it difficult to draw strong conclusions about the circumstances under which human support is most effective. Future research should report detailed descriptions of sample and intervention characteristics and describe the mechanisms through which human support is expected to be most useful for the DMHI.
COVID-19 forced college administrators to reassess how they provide students with the most effective methods of support. This project examined the first year of a novel digital peer mentoring program aimed at connecting diverse students to the campus resources they needed to navigate the transition to and through their first year of college. MentorHub, a referral and supportive accountability mobile application, was implemented with first-year undergraduates at a large, private university in the northeastern USA. MentorHub tracked students' current challenges and connected them with trained peer mentors who provided support and referrals to campus resources (e.g., mental health, financial, academic). Analyses were not hypothesis-driven but exploratory, intended to improve the platform. In the first year of the program (August 2021 to June 2022), 47% (N=3141) of students logged onto the platform at least once. Patterns of self-reported challenges revealed that career concerns were most pressing at the beginning of the fall semester and that academic habits were most challenging over the course of the year. Referrals (N=756) were made by mentors, 13% of which were for health and well-being. First-generation and underrepresented minority students showed distinct patterns in referrals. Findings revealed distinct patterns in self-reported challenges across the academic year. Students' use of MentorHub and responses to in-app questions allowed for a real-time understanding of student challenges and patterns of engagement with peer mentors. Implications for a stepped-care approach to addressing student challenges are discussed. Supplementary Information: The online version contains supplementary material available at 10.1007/s41347-023-00303-8.