Background: Both clinicians and patients use medical mobile phone apps. Anyone can publish medical apps, which leads to content of variable quality that may have a serious impact on human lives. We herein provide an overview of the prevalence of expert involvement in app development and whether or not app content adheres to current medical evidence.

Objective: To systematically review studies evaluating expert involvement or adherence of app content to medical evidence in medical mobile phone apps.

Methods: We systematically searched 3 databases (PubMed, The Cochrane Library, and EMBASE) and included studies evaluating expert involvement or adherence of app content to medical evidence in medical mobile phone apps. Two authors performed data extraction independently. Qualitative analysis of the included studies was performed.

Results: Based on the inclusion criteria, 52 studies were included in this review. These studies assessed a total of 6520 apps and dealt with a variety of medical specialties and topics. Twenty-eight studies assessed expert involvement, which was found in 9-67% of the assessed apps. Thirty studies (including 6 studies that also assessed expert involvement) assessed adherence of app content to current medical evidence. Thirteen studies found that 10-87% of the assessed apps adhered fully to the compared evidence (published studies, recommendations, and guidelines). Seventeen studies found that none of the assessed apps (n=2237) adhered fully to the compared evidence.

Conclusions: Most medical mobile phone apps lack expert involvement and do not adhere to relevant medical evidence.
Background: Electrocardiogram (ECG) interpretation is of great importance for patient management. However, medical students frequently lack proficiency in ECG interpretation and rate their ECG training as inadequate. Our aim was to examine the effect of a standalone web-based ECG tutorial and to assess the retention of skills using multiple follow-up intervals.

Methods: 203 medical students were included in the study. All participants completed a pre-test, an ECG tutorial, and a post-test. The participants were also randomised to complete a retention-test after short (2–4 weeks), medium (10–12 weeks), or long (18–20 weeks) follow-up. Intragroup comparisons of test scores were done using the paired-samples t-test. Intergroup comparisons of test scores were performed using the independent-samples t-test and ANOVA, whereas demographic data were compared using ANOVA and the Chi-squared test.

Results: The overall mean test score improved significantly from 52.7 (SD 16.8) in the pre-test to 68.4 (SD 12.3) in the post-test (p < 0.001). Junior and senior students demonstrated significantly different baseline scores (45.5 vs. 57.8 points; p < 0.001) but showed comparable score gains (16.5 and 15.1 points, respectively; p = 0.48). All three follow-up groups experienced a decrease in test score between post-test and retention-test: from 67.4 (SD 12.3) to 60.2 (SD 8.3) in the short follow-up group, from 71.4 (SD 12.0) to 60.8 (SD 8.9) in the medium follow-up group, and from 66.1 (SD 12.1) to 58.6 (SD 8.6) in the long follow-up group (p < 0.001 for all). However, there were no significant differences in mean retention-test score between the groups (p = 0.33). Both junior and senior students showed a decline in test score at follow-up (from 62.0 (SD 10.6) to 56.2 (SD 9.8) and from 72.9 (SD 11.4) to 62.5 (SD 6.6), respectively). When comparing the pre-test to retention-test delta scores, junior students had learned significantly more than senior students (junior students improved 10.7 points and senior students improved 4.7 points; p = 0.003).

Conclusion: A standalone web-based ECG tutorial can be an effective means of teaching ECG interpretation skills to medical students. The newly acquired skills are, however, rapidly lost when the intervention is not repeated.
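The intragroup comparisons above rely on the paired-samples t-test, which tests whether the mean of the per-student differences between two measurements (e.g. pre-test vs. post-test score) differs from zero. A minimal sketch in pure Python is shown below; the score values are hypothetical and for illustration only, not data from the study.

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired-samples t statistic: mean difference divided by its
    standard error, with n - 1 degrees of freedom."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / math.sqrt(n))
    return t, n - 1

# Hypothetical pre/post test scores for eight students (not study data)
pre = [48, 55, 60, 42, 51, 58, 47, 53]
post = [63, 70, 72, 58, 66, 71, 60, 69]

t, df = paired_t(pre, post)
print(f"t = {t:.2f} on {df} degrees of freedom")
```

The resulting t statistic would then be compared against the t distribution with n - 1 degrees of freedom to obtain the p-value; in practice this is usually done with a statistics package rather than by hand.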
This study led to the development of a theoretical test on VATS lobectomy consisting of multiple-choice questions. Both content and construct validity evidence were established.