Background: The field of eHealth is growing rapidly and chaotically. Health care professionals need guidance on reviewing and assessing health-related smartphone apps so that they can propose appropriate ones to their patients. However, to date, no framework or evaluation tool fulfills this purpose.

Objective: Before developing a tool to help health care professionals assess and recommend apps to their patients, we aimed to create an overview of published criteria for describing and evaluating health apps.

Methods: We conducted a systematic review to identify existing criteria for eHealth smartphone app evaluation. Relevant databases and trial registers were queried for articles. Articles were included that (1) described tools, guidelines, dimensions, or criteria to evaluate apps, (2) were available in full text, and (3) were written in English, French, German, Italian, Portuguese, or Spanish. We proposed a conceptual framework for app evaluation based on the dimensions reported in the selected articles and revised it iteratively in discussion rounds with international stakeholders. The conceptual framework was used to synthesize the reported evaluation criteria. The list of criteria was discussed and refined by the research team.

Results: Screening of 1258 articles yielded 128 (10.17%) that met the inclusion criteria. Of these 128 articles, 30 (23.4%) reported the use of self-developed criteria and described their development processes incompletely. Although 43 evaluation instruments were used only once, 6 were used in multiple studies. Most articles (83/128, 64.8%) did not report following theoretical guidelines; those that did cited 37 theoretical frameworks. On the basis of the selected articles, we proposed a conceptual framework covering 6 app evaluation dimensions: context, stakeholder involvement, features and requirements, development processes, implementation, and evaluation. After standardizing the definitions, we identified 205 distinct criteria. Through consensus, the research team relabeled 12 of these and added 11 more, mainly related to ethical, legal, and social aspects, resulting in 216 evaluation criteria. No criteria had to be moved between dimensions.

Conclusions: This study provides a comprehensive overview of the criteria currently used in clinical practice to describe and evaluate apps. Such an overview is necessary because none of the reviewed criteria sets was comprehensive, and none used consistent definitions and terminology. Although the resulting overview is impractical for clinical use in its current form, it confirms the need to craft it into a purpose-built, theory-driven tool. Therefore, in a subsequent step, based on our current criteria set, we plan to construct an app evaluation tool with 2 parts: a short section (1-3 questions per dimension) to quickly disqualify clearly unsuitable apps and a longer one to investigate more likely candidates in closer detail. We will use a Delphi consensus-building process and develop a user manual to prepare for this undertaking.

Trial Registration: PROSPERO International Prospective Register of Systematic Reviews CRD42021227064; https://www.crd.york.ac.uk/prospero/display_record.php?ID=CRD42021227064
Background: Advances in digital health and digital learning are transforming the lives of patients, health care providers, and health professional students. In the interdisciplinary field of communication sciences and disorders (CSD), digital uptake and the incorporation of digital topics and technologies into clinical training programs have lagged behind other medical fields. There is a need to understand professional and student experiences, opinions, and needs regarding digital health and learning topics so that effective implementation strategies can be optimized.

Objective: This cross-sectional survey study aims to investigate professional and student knowledge, use, attitudes, and preferences toward digital health and digital learning across disciplines in the German-speaking population.

Methods: An open-ended, web-based survey was developed and administered to professionals and students in CSD, including phoniatricians and otolaryngologists, speech-language pathologists (German: Logopäd*innen), medical students, and speech-language pathology students. Differences in knowledge, use, attitudes, and preferences across profession, generation, and years of experience were analyzed.

Results: A total of 170 participants completed the survey. Respondents were more familiar with digital learning concepts than with eHealth concepts. Significant differences were noted across profession (P<.001), generation (P=.001), and years of experience (P<.001), showing that students and younger participants were less familiar with digital health terminology. Professional (P<.001) and generational (P=.04) differences were also found in knowledge of digital therapy tools, though no significant differences were found for digital learning tools. Participants primarily used computers, tablets, and mobile phones; non-eHealth-specific tools (eg, word processing and videoconferencing applications); and digital formats such as videos, web courses, and apps. Many indicated a desire for more interactive platforms, such as virtual reality. Significant differences were found across generations for positive views toward digitalization (P<.001) and across professions for feelings of preparedness (P=.04). Interestingly, across profession (P=.03), generation (P=.006), and years of experience (P=.01), students and younger participants demonstrated greater support for medical certification. Commonly reported areas of concern included technical difficulties, the quality and validity of digital materials, data privacy, and social presence. Respondents tended to prefer blended learning, a limited to moderate level of interactivity, and time- and space-flexible learning environments (63/170, 37.1%), with a notable proportion still preferring traditional time- and space-dependent learning (49/170, 28.8%).

Conclusions: This comprehensive investigation into the current state of CSD student and professional opinions and experiences shows that incorporating digital topics and skills into academic and professional development curricula will be crucial for ensuring that the field is prepared for an ever-digitalizing health care environment. Deeper empirical investigation into the efficacy and acceptance of digital learning and practice strategies, along with systematic training and practical organizational support, must be planned to ensure adaptive education and practice.