Background: In recent years, the considerable increase in the number of mobile health (mHealth) apps has made health care more accessible and affordable for all. However, the exponential growth in mHealth solutions has occurred with almost no control or regulation of any kind. Despite some recent initiatives, there is still no specific regulation procedure, accreditation system, or set of standards to guide the development of apps, mitigate risks, or guarantee quality.

Objective: The main aim of this study was to propose a set of criteria for mHealth-related apps on the basis of what is available from published studies, guidelines, and standards in the various areas related to health app development.

Methods: We used three sources of information to identify the most important criteria. First, we conducted a systematic review of all the studies published on pain-related apps. Second, we searched for health app recommendations on the websites of professional organizations. Third, we looked for standards governing the development of software for medical devices on the specialized websites of regulatory organizations. We then grouped and subsumed the criteria we had identified on the basis of their shared characteristics. Finally, the comprehensibility and perceived importance of the resulting criteria were evaluated for face validity with a group of 18 stakeholders.

Results: We identified a total of 503 criteria from all sources, which, after close analysis, were grouped into eight categories comprising 36 important criteria for health apps. The resulting categories were usability; privacy; security; appropriateness and suitability; transparency and content; safety; technical support and updates; and technology. The results of the preliminary analysis showed that the criteria were mostly understood by the group of stakeholders, and all of them were perceived as important.

Conclusions: This set of criteria can help health care providers, developers, patients, and other stakeholders to guide the development of mHealth-related apps and, potentially, to measure the quality of an mHealth app.
Background: In recent years, there has been exponential growth in mobile health (mHealth)-related apps, and this growth has occurred in a largely unsupervised manner. A set of criteria that all stakeholders could use to guide the development process and assess the quality of these apps is therefore of the utmost importance.

Objective: The aim of this paper is to study the validity of the Mobile App Development and Assessment Guide (MAG), a guide recently created to help stakeholders develop and assess mobile health apps.

Methods: To validate the MAG, we used the Delphi method to reach a consensus among participating stakeholders. We identified 158 potential participants: 45 patients as potential end users, 41 health care professionals, and 72 developers. We sent participants an online survey and asked them to rate, on a scale from 0 to 10, how important they considered each item in the guide to be. Two rounds were enough to reach consensus.

Results: In the first round, almost one-third (n=42) of those invited participated, and more than half of those (n=24) also participated in the second round. Most items in the guide were found to be important for a quality mHealth-related app; a total of 48 criteria were established as important. "Privacy," "security," and "usability" were the categories that included most of the important criteria.

Conclusions: The data support the validity of the MAG. In addition, the findings identified the criteria that stakeholders consider most important. The MAG will help advance the field by providing developers, health care professionals, and end users with a valid guide for developing and identifying quality mHealth-related apps.
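The Delphi procedure above rests on a numerical consensus rule applied to the 0-10 item ratings. As a rough illustration only, the following Python sketch flags an item as having reached consensus; the median/IQR thresholds and the ratings are hypothetical conventions, not values taken from the study.

```python
# Illustrative sketch of a Delphi consensus check on 0-10 item ratings.
# The rule used here (median >= 7 and IQR <= 2) is a common convention,
# not necessarily the one applied in the MAG validation study.
import statistics

def reaches_consensus(ratings, median_cutoff=7, iqr_cutoff=2):
    """Return True if an item's ratings meet the illustrative consensus rule."""
    quartiles = statistics.quantiles(ratings, n=4)  # [Q1, Q2, Q3]
    median = statistics.median(ratings)
    iqr = quartiles[2] - quartiles[0]
    return median >= median_cutoff and iqr <= iqr_cutoff

# Hypothetical round-one ratings for a single criterion
item_ratings = [8, 9, 7, 10, 8, 6, 9, 8]
print(reaches_consensus(item_ratings))  # True -> item retained as "important"
```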
Background: There is a huge number of health-related apps available, and the number is growing fast. However, many of them have been developed without any kind of quality control. In an attempt to contribute to the development of high-quality apps and enable existing apps to be assessed, several guides have been developed.

Objective: The main aim of this study was to examine the interrater reliability of a new guide, the Mobile App Development and Assessment Guide (MAG), and compare it with one of the most widely used guides in the field, the Mobile App Rating Scale (MARS). We also examined whether the interrater reliability of the measures is consistent across multiple types of apps and stakeholders.

Methods: To study the interrater reliability of the MAG and MARS, we evaluated the 4 most downloaded health apps for chronic health conditions in the medical category of the iOS and Android stores (ie, App Store and Google Play). A group of 8 reviewers, representative of the individuals who would be most knowledgeable about and interested in the use and development of health-related apps, and including different types of stakeholders such as clinical researchers, engineers, health care professionals, and end users as potential patients, independently evaluated the quality of the apps using the MAG and MARS. To study interrater reliability, we calculated Krippendorff's alpha for every category in the 2 guides, for each type of reviewer and every app, separately and combined.

Results: Only a few categories of the MAG and MARS demonstrated high interrater reliability. Although the MAG was found to be superior, there was considerable variation in the scores between the different types of reviewers. The categories with the highest interrater reliability in the MAG were "Security" (α=0.78) and "Privacy" (α=0.73). Two other categories, "Usability" and "Safety," came very close to the reliability threshold among health care professionals (α=0.62 and α=0.61, respectively). The total interrater reliability of the MAG (ie, across all categories) was 0.45, whereas the total interrater reliability of the MARS was 0.29.

Conclusions: This study shows that some categories of the MAG have substantial interrater reliability. Importantly, the data show that the MAG scores are better than those provided by the MARS, the most commonly used guide in the area. However, there is great variability in the responses, which seems to be associated with subjective interpretation by the reviewers.
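For readers unfamiliar with the statistic, the following Python sketch shows how Krippendorff's alpha can be computed for a reviewer-by-item rating matrix using the third-party krippendorff package (pip install krippendorff). The matrix values are hypothetical and do not reproduce the study's data.

```python
# Minimal sketch: interrater reliability via Krippendorff's alpha.
# Ratings below are invented; np.nan marks a missing rating.
import numpy as np
import krippendorff

# Rows = reviewers, columns = rated items (eg, guide categories for one app).
reliability_data = np.array([
    [3, 4, 2, 5, np.nan],
    [3, 4, 3, 5, 4],
    [2, 4, 2, 4, 4],
])

# Ordinal level of measurement fits Likert-style category scores.
alpha = krippendorff.alpha(reliability_data=reliability_data,
                           level_of_measurement="ordinal")
print(f"Krippendorff's alpha: {alpha:.2f}")
```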
BACKGROUND: In recent years, the considerable increase in the number of mobile health (mHealth) apps has made health care more accessible and affordable for all. However, the exponential growth in mHealth solutions has occurred with almost no control or regulation of any kind. Despite some recent initiatives, there is still no specific regulation procedure, accreditation system, or set of standards to guide the development of apps, mitigate risks, or guarantee quality.

OBJECTIVE: The main aim of this study is to provide a set of standards for mobile health-related apps on the basis of what is available from guidelines, frameworks, and standards in the field of health app development.

METHODS: To identify the most important criteria, we used three strategies. First, we conducted a systematic review of all the studies published on health-related apps. Second, we searched for health app recommendations on the websites of professional organizations. Finally, we looked for standards governing the development of software for medical devices on the specialized websites of regulatory organizations. We then compiled the criteria we had identified and determined which of them could be regarded as essential, recommendable, or desirable.

RESULTS: We identified a total of 168 criteria from the systematic review, 282 criteria from published guidelines, and 53 criteria from the standards for medical devices. These criteria were then grouped and subsumed under 8 categories, which included 36 important criteria for health apps. Of these, 7 were considered essential, 18 recommendable, and 11 desirable. The more of these criteria an mHealth app fulfills, the higher its expected quality.

CONCLUSIONS: This set of standards can be easily used by health care providers, developers, patients, and other stakeholders, both to guide the development of mHealth-related apps and to measure the quality of an mHealth app.
Treatment for individuals with fibromyalgia syndrome (FMS) is complex and is not always accessible to those who could benefit. The aim of this study was to conduct a preliminary evaluation of a mobile app-delivered intervention based on cognitive behavioral treatment (CBT) for helping adults self-manage fibromyalgia symptoms. A total of 100 adults with FMS (mean [SD] age = 49.81 [9.99] years; 94% women) were given access to the digital treatment program and downloaded the app. Pain severity, anxiety symptoms, depressive symptoms, fatigue, and sleep quality were assessed at pre-treatment, post-treatment, and 3-month follow-up. Fifty-three of the participants completed the 47-day treatment. The data showed significant improvements in pain severity (p = 0.007, d = 0.43), anxiety (p = 0.011, d = 0.40), and depressive symptoms (p = 0.001, d = 0.50) from pre-treatment to post-treatment. The effect sizes associated with app use are consistent with the improvements seen in previously published clinical trials of CBT for FMS. Improvements were generally maintained, although the outcomes decreased somewhat from post-treatment to the 3-month follow-up. Most participants reported that they were very satisfied with the app. Use of the app was associated with levels of improvement similar to those found with in-person CBT treatment for FMS. Research to evaluate the effectiveness of the app in a controlled trial is warranted.
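As a worked illustration of the statistics reported above, the following Python sketch computes a paired-samples p value and Cohen's d for pre- and post-treatment pain scores. It mirrors the type of pre/post comparison described in the abstract, but the scores are invented and the effect size definition (mean difference over the SD of the differences) is one common convention for paired data, not necessarily the one used by the authors.

```python
# Sketch of a pre/post paired-samples comparison with an effect size,
# analogous to the p and d values reported in the abstract. Data are invented.
import numpy as np
from scipy import stats

pre = np.array([6.5, 7.0, 5.5, 8.0, 6.0, 7.5])   # pain severity, pre-treatment
post = np.array([5.0, 6.5, 4.5, 7.0, 5.5, 6.0])  # pain severity, post-treatment

t, p = stats.ttest_rel(pre, post)  # paired-samples t test

diffs = pre - post
d = diffs.mean() / diffs.std(ddof=1)  # Cohen's d for paired data (d_z)

print(f"t = {t:.2f}, p = {p:.3f}, d = {d:.2f}")
```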