Digital health interventions may improve a range of health behaviours. However, the rapid proliferation of technological solutions often outpaces any rigorous assessment of the quality of the tools. This study aimed to review and assess the quality of the mobile applications (apps) available for low back pain interventions. In September 2019, two reviewers searched the official Android (Play Store) and iOS (App Store) stores, localised for Spain and the United Kingdom, for apps related to interventions for low back pain. Seventeen apps were finally included. The quality of the apps was measured using the Mobile App Rating Scale (MARS). The scores of each section and the final score of each app were retrieved, and the mean and standard deviation were calculated. The average quality ranged between 2.83 and 4.57 (mean 3.82) on a scale from 1 (inadequate) to 5 (excellent). The best scores were found in functionality (mean 4.70), followed by aesthetics (mean 4.10); information (mean 2.93) and engagement (mean 3.58) were the worst-rated sections. The apps generally showed good overall quality, especially in terms of functionality and aesthetics, whereas engagement and information should be improved in most of them. Moreover, scientific evidence is needed to support the use of these health tools.
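As a rough illustration of how MARS scores are aggregated, the sketch below (Python) computes subscale means and an overall app quality score from per-item ratings, then summarises several apps with a mean and standard deviation. The item counts and ratings are illustrative placeholders, not data from this study.

```python
# Minimal sketch of MARS score aggregation, under the assumption that each
# subscale score is the mean of its items (rated 1-5) and the overall app
# quality score is the mean of the four objective subscales.
import statistics

# Hypothetical item ratings for one app (example values only).
app_ratings = {
    "engagement":    [3, 4, 3, 4, 4],
    "functionality": [5, 5, 4, 5],
    "aesthetics":    [4, 4, 5],
    "information":   [3, 2, 3, 3],
}

def mars_scores(ratings):
    """Return subscale means and the overall mean app quality score."""
    subscales = {name: statistics.mean(items) for name, items in ratings.items()}
    overall = statistics.mean(subscales.values())
    return subscales, overall

subscales, overall = mars_scores(app_ratings)
print(subscales)          # per-subscale means on the 1-5 scale
print(round(overall, 2))  # overall quality score for this app

# Across several apps, the study-level summary is the mean and standard
# deviation of the overall scores (values below are illustrative).
overall_scores = [3.1, 4.2, 3.8, 4.5]
print(statistics.mean(overall_scores), statistics.stdev(overall_scores))
```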
Background: Many mobile health (mHealth) apps are available in the mobile app stores, and they enter the business-to-customer market with limited controls. Both the apps that users use autonomously and those designed to be recommended by practitioners require end-user validation to minimize the risk of using apps that are ineffective or harmful. Prior studies have reviewed the most relevant aspects of a tool designed for assessing mHealth app quality, and different options have been developed for this purpose. However, the psychometric properties of these mHealth quality measurement tools, that is, their validity and reliability for their purpose, also need to be studied. The Consensus-based Standards for the Selection of Health Measurement Instruments (COSMIN) initiative has developed tools for selecting the most suitable measurement instrument for health outcomes, and one of its main fields of study is their psychometric properties.

Objective: This study aims to identify and psychometrically analyze, following the COSMIN guideline, the tools used to measure the quality of mHealth apps.

Methods: From February 1, 2019, to December 31, 2019, 2 reviewers searched the PubMed and Embase databases, identifying mHealth app quality measurement tools and all the validation studies associated with each of them. To be included, a study had to validate a tool designed to assess mHealth apps. Studies that used these tools to assess mHealth apps but did not include any psychometric validation were excluded. The measurement tools were analyzed according to the 10 psychometric properties described in the COSMIN guideline, and the dimensions and items covered by each tool were also examined.

Results: The initial search yielded 3372 articles. Only 10 met the inclusion criteria and were analyzed in this review, covering 8 measurement tools. Of these tools, 4 had validated ≥5 of the psychometric properties defined in the COSMIN guideline. Although some of the tools measure only usability, others also cover dimensions such as engagement, esthetics, or functionality. Furthermore, 2 measurement tools, the Mobile App Rating Scale and the mHealth Apps Usability Questionnaire, have both a user version and a professional version.

Conclusions: The Health Information Technology Usability Evaluation Scale and the Measurement Scales for Perceived Usefulness and Perceived Ease of Use were the most extensively validated tools, but they focus heavily on usability. The Mobile App Rating Scale has a moderate number of validated psychometric properties, measures a broad set of quality dimensions, has been applied to a large number of mHealth apps, and is in widespread use. Continuing to validate this tool on the remaining psychometric properties could make it an appropriate option for evaluating the quality of mHealth apps.
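A minimal sketch of the tallying step described in the results, assuming hypothetical per-tool counts of validated COSMIN properties; the numbers below are placeholders, not the review's findings.

```python
# Count, for each measurement tool, how many of the 10 COSMIN psychometric
# properties its validation studies covered, and flag tools reaching the
# >= 5 threshold mentioned in the results. Counts are illustrative only.
TOTAL_PROPERTIES = 10

hypothetical_counts = {
    "Health-ITUES": 7,
    "Perceived Usefulness / Perceived Ease of Use scales": 6,
    "MARS": 5,
    "MAUQ": 3,
}

for tool, n in hypothetical_counts.items():
    status = "meets the >=5-property threshold" if n >= 5 else "below threshold"
    print(f"{tool}: {n}/{TOTAL_PROPERTIES} properties validated ({status})")
```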
The purpose of this study was to assess the reliability of the LIS2DH12 accelerometer in two different positions, using the commercial Actigraph GT9X sensor as the reference instrument. Five participants completed two gait tests on a treadmill; both sensors were worn first on the wrist and then around the thigh. Each test consisted of a 1 min walk for participants to become accustomed to the treadmill, followed by a 2 min trial at each of ten pre-set speeds. Data from both sensors were collected in real time. For each axis and speed, the maximum peaks, minimum peaks, and the mean of the complete acceleration signal were extracted, and the intraclass correlation coefficient (ICC) was used to evaluate the agreement between the LIS2DH12 and the Actigraph measurements; the standard deviation of the mean was also obtained. The results show that the LIS2DH12 measurements were in strong agreement with those of the Actigraph GT9X (ICC > 0.8 on all three axes). This study concludes that the LIS2DH12 is as reliable and accurate as the Actigraph GT9X Link and would therefore be a suitable tool for future kinematic studies.
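A hedged sketch of the kind of agreement analysis described above: per-axis signal features (maximum peak, minimum peak, mean) are extracted from paired recordings and compared with an intraclass correlation. The data are simulated, and the column names, ICC model, and pingouin-based implementation are assumptions for illustration, not the study's actual code.

```python
import numpy as np
import pandas as pd
import pingouin as pg

def signal_features(acc: np.ndarray) -> dict:
    """Max peak, min peak, and mean of a 1-D acceleration signal (one axis)."""
    return {"max_peak": float(acc.max()),
            "min_peak": float(acc.min()),
            "mean": float(acc.mean())}

rng = np.random.default_rng(0)
rows = []
for speed in range(10):                            # ten pre-set treadmill speeds
    base = rng.normal(0.0, 1.0, 2000)              # simulated single-axis signal
    for sensor in ("LIS2DH12", "ActigraphGT9X"):
        noisy = base + rng.normal(0.0, 0.02, base.size)   # small sensor noise
        feats = signal_features(noisy)
        rows.append({"speed": speed, "sensor": sensor, "value": feats["max_peak"]})

df = pd.DataFrame(rows)

# ICC with the speeds as "targets" and the two sensors as "raters"; agreement
# is usually read from the ICC2 (two-way random, absolute agreement) row.
icc = pg.intraclass_corr(data=df, targets="speed", raters="sensor", ratings="value")
print(icc[["Type", "ICC", "CI95%"]])
```

The same routine would be repeated per axis and per feature (maximum peak, minimum peak, mean) to build the full agreement table.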