Background: User experience and engagement are critical to mental health apps' ability to support users. However, work examining the relationships among user experience, engagement, and popularity has been limited. Understanding how user experience relates to engagement with and popularity of mental health apps can clarify the relationship between subjective and objective measures of app use. In turn, this may inform efforts to develop more effective and appealing mental health apps and to ensure that they reach wide audiences.
Objective: We aimed to examine the relationships among subjective measures of user experience, objective measures of popularity, and engagement in mental health apps.
Methods: We conducted a preregistered secondary data analysis of 56 mental health apps. To measure user experience, we used expert ratings on the Mobile App Rating Scale (MARS) and consumer ratings from the Apple App Store and Google Play. To measure engagement, we acquired estimates of monthly active users (MAU) and user retention. To measure app popularity, we used download count, total app revenue, and MAU.
Results: The MARS total score was moderately positively correlated with app-level revenue (Kendall rank correlation τ=0.30, P=.002), MAU (τ=0.39, P<.001), and downloads (τ=0.41, P<.001). However, the MARS total score and each of its subscales (Engagement, Functionality, Aesthetics, and Information) showed very small correlations with user retention at 1, 7, and 30 days after download. Furthermore, the MARS total score correlated with app store rating at only τ=0.12 (P=.20), which did not meet our threshold for significance.
Conclusions: More popular mental health apps receive better ratings of user experience than less popular ones. However, user experience does not predict sustained engagement with mental health apps. Mental health app developers and evaluators therefore need to better understand user experience and engagement, as well as to define sustained engagement, what leads to it, and how to create products that achieve it. This understanding might be supported by better collaboration between industry and academic teams to advance a science of engagement.
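The correlations reported above are Kendall rank correlations between MARS scores and per-app popularity or engagement metrics. As a minimal sketch of how such an analysis can be run, assuming hypothetical per-app data (the variable names and values below are illustrative only, not the study's data):
```python
# Hedged sketch: Kendall rank correlations between MARS total scores and
# popularity/engagement metrics. All data below are made up for illustration.
from scipy.stats import kendalltau

mars_total = [3.1, 4.2, 3.8, 2.9, 4.5, 3.4]          # expert MARS total score per app
downloads  = [12e3, 450e3, 90e3, 8e3, 1.2e6, 30e3]   # estimated download counts
retention7 = [0.11, 0.09, 0.14, 0.08, 0.10, 0.12]    # proportion of users retained at day 7

for name, metric in [("downloads", downloads), ("7-day retention", retention7)]:
    tau, p = kendalltau(mars_total, metric)          # rank correlation and p-value
    print(f"MARS vs {name}: tau={tau:.2f}, p={p:.3f}")
```
With real data, each list would hold one value per app (56 apps in the study), and the same call would be repeated for revenue, MAU, and the MARS subscales.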
Objective: Given the increasing number of publicly available mental health apps, independent advice is needed to guide adoption. This paper discusses the challenges and opportunities of current mental health app rating systems and describes the refinement of one prominent system, the One Mind PsyberGuide Credibility Rating Scale (PGCRS).
Methods: PGCRS Version 1 was developed in 2013 and deployed for 7 years, during which time a number of limitations were identified. Version 2 was created in multiple stages, including a review of evaluation guidelines and consumer research, input from scientific experts, testing, and evaluation of face validity. We then re-reviewed 161 mental health apps using the updated rating scale, investigated the reliability and discrepancy of initial scores, and updated the ratings on the One Mind PsyberGuide public app guide.
Results: Reliabilities across the scale's 9 items ranged from −0.10 to 1.00, indicating that some characteristics of apps are more difficult to rate consistently. The average overall score of the 161 reviewed mental health apps was 2.51 out of 5.00 (range 0.33–5.00). Ratings were not strongly correlated with app store star ratings, suggesting that credibility scores provide information different from that contained in star ratings.
Conclusion: The PGCRS summarizes and weights available information in 4 domains: intervention specificity, consumer ratings, research, and development. Final scores are created through an iterative process of initial rating and consensus review. The process of updating this rating scale and integrating it into a procedure for evaluating apps demonstrates one method for determining app quality.
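The abstract does not state which reliability coefficient was computed for the 9 items. Purely as a hedged sketch, assuming two independent raters and using a simple per-item correlation as a rough agreement measure (the rater names, scores, and choice of statistic below are assumptions for illustration, not the study's method):
```python
# Hedged sketch: per-item inter-rater agreement on a 9-item rating scale.
# The coefficient and the simulated scores are illustrative assumptions only.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_apps, n_items = 161, 9
rater_a = rng.integers(0, 6, size=(n_apps, n_items)).astype(float)   # initial ratings
rater_b = np.clip(rater_a + rng.normal(0, 1.0, size=(n_apps, n_items)), 0, 5)  # second rater with noise

for item in range(n_items):
    r, _ = pearsonr(rater_a[:, item], rater_b[:, item])
    print(f"Item {item + 1}: inter-rater r = {r:.2f}")
```
In practice, items with low or negative per-item agreement would be the ones flagged for consensus review, consistent with the iterative rating-and-review process the abstract describes.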
Digital mental health is often touted as a solution to issues of access to mental health care. However, little research has examined the accessibility of digital mental health, especially for people with disabilities. In this piece, we define accessibility as it relates to mental health apps, describe the current state of accessibility in the digital world broadly and in mental health apps more specifically, outline why accessibility matters in mental health apps, and identify future steps to better incorporate accessibility into the research and development of mental health apps.
Mental health (MH) apps can be used as adjunctive tools in traditional face-to-face therapy to help implement components of evidence-based treatments. However, practitioners interested in using MH apps face a variety of challenges, including knowing which apps are appropriate to use. Although some resources are available to help practitioners identify apps, granular analyses of how faithfully specific clinical skills are represented in apps are lacking. This study aimed to conduct a review and analysis of MH apps containing a core component of cognitive behaviour therapy (CBT): cognitive restructuring (CR). A keyword search for apps providing CR functionality on the Apple App Store and Google Play yielded 246 apps after removal of duplicates; this set was further reduced to 15 apps following verification of a CR component and application of other inclusion/exclusion criteria. Apps were coded based on their inclusion of core elements of CR and on general app features, including app content, interoperability/data sharing, professional involvement, ethics, and data safeguards. They were also rated on user experience as assessed by the Mobile App Rating Scale (MARS). Whereas a majority of the CR apps include most core CR elements, they vary considerably with respect to more granular sub-elements of CR (e.g. rating the intensity of emotions), other general app features, and user experience (average MARS score = 3.53, range 2.30–4.58). Specific apps that fared best with respect to CR fidelity and user experience dimensions are highlighted, and implications of the findings for clinicians, researchers, and app developers are discussed.
Key learning aims:
(1) To identify no-cost mobile health apps that practitioners can adopt to facilitate cognitive restructuring.
(2) To review how well the core elements of cognitive restructuring are represented in these apps.
(3) To characterize these apps with respect to their user experience and additional features.
(4) To provide examples of high-quality apps that represent cognitive restructuring with fidelity and facilitate its clinical implementation.