Background There is increasing interest in using routinely collected eHealth data to support reflective practice and long-term professional learning. Studies have evaluated the impact of dashboards on clinician decision-making, task completion time, user satisfaction, and adherence to clinical guidelines. Objective This scoping review aims to summarize the literature on dashboards based on patient administrative, medical, and surgical data designed to support clinicians' reflective practice. Methods A scoping review was conducted using the Arksey and O’Malley framework. A search was conducted in 5 electronic databases (MEDLINE, Embase, Scopus, ACM Digital Library, and Web of Science) to identify studies that met the inclusion criteria. Study selection and characterization were performed by 2 independent reviewers (BB and CP). One reviewer extracted the data, which were analyzed descriptively to map the available evidence. Results A total of 18 dashboards from 8 countries were assessed. The dashboards were designed for performance improvement (10/18, 56%), to support quality and safety initiatives (6/18, 33%), and for management and operations (4/18, 22%). Data visualizations were primarily designed for team use (12/18, 67%) rather than for individual clinicians (4/18, 22%). Evaluation methods included asking clinicians directly (11/18, 61%), observing user behavior through clinical indicators and use log data (14/18, 78%), and usability testing (4/18, 22%). The studies reported high scores on standard usability questionnaires, favorable surveys, and positive interview feedback. Improvements to underlying clinical indicators were observed in 78% (7/9) of the studies, whereas 22% (2/9) reported no significant changes in performance. Conclusions This scoping review maps the current literature landscape on dashboards based on routinely collected clinical indicator data.
Although common data visualization techniques and clinical indicators were used across studies, the dashboards varied widely in design and evaluation. Design processes were not documented in sufficient detail for reproducibility. We identified a lack of interface features to support clinicians in making sense of and reflecting on their personal performance data.
Background Hospitals routinely collect large amounts of administrative data, such as length of stay, 28-day readmissions, and hospital-acquired complications; yet, these data are underused for continuing professional development (CPD). First, these clinical indicators are rarely reviewed outside of existing quality and safety reporting. Second, many medical specialists view their CPD requirements as time-consuming and as having minimal impact on practice change and patient outcomes. There is an opportunity to build new user interfaces based on these data, designed to support individual and group reflection. Data-informed reflective practice has the potential to generate new insights about performance, bridging the gap between CPD and clinical practice. Objective This study aims to understand why routinely collected administrative data have not yet become widely used to support reflective practice and lifelong learning. Methods We conducted semistructured interviews (N=19) with thought leaders from a range of backgrounds, including clinicians, surgeons, chief medical officers, information and communications technology professionals, informaticians, researchers, and leaders from related industries. Interviews were thematically analyzed by 2 independent coders. Results Respondents identified visibility of outcomes, peer comparison, group reflective discussions, and practice change as potential benefits. The key barriers included legacy technology, distrust of data quality, privacy, data misinterpretation, and team culture. Respondents suggested recruiting local champions for co-design, presenting data for understanding rather than information, coaching by specialty group leaders, and timely reflection linked to CPD as enablers of successful implementation. Conclusions Overall, there was consensus among thought leaders, bringing together insights from diverse backgrounds and medical jurisdictions.
We found that clinicians are interested in repurposing administrative data for professional development despite concerns about underlying data quality, privacy, legacy technology, and visual presentation. They prefer group reflection led by supportive specialty group leaders over individual reflection. Our findings provide novel insights into the specific benefits, barriers, and enablers of potential reflective practice interfaces based on these data sets. They can inform the design of new models of in-hospital reflection linked to the annual CPD planning-recording-reflection cycle.
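To make concrete what "routinely collected clinical indicators" means in practice, the sketch below shows how two of the indicators named above (length of stay and 28-day readmissions) could be derived from hospital administrative records. This is a minimal illustration only; the field names and sample records are hypothetical and are not drawn from the studies reviewed.

```python
from datetime import date

# Hypothetical admission records; all names and values are illustrative.
admissions = [
    {"patient": "A", "admit": date(2023, 1, 1), "discharge": date(2023, 1, 5)},
    {"patient": "A", "admit": date(2023, 1, 20), "discharge": date(2023, 1, 22)},
    {"patient": "B", "admit": date(2023, 2, 1), "discharge": date(2023, 2, 10)},
]

def mean_length_of_stay(records):
    """Average length of stay in days across all admissions."""
    stays = [(r["discharge"] - r["admit"]).days for r in records]
    return sum(stays) / len(stays)

def readmission_rate_28d(records):
    """Share of discharges followed by a readmission of the same patient
    within 28 days."""
    by_patient = {}
    for r in sorted(records, key=lambda r: r["admit"]):
        by_patient.setdefault(r["patient"], []).append(r)
    discharges, readmits = 0, 0
    for stays in by_patient.values():
        # Compare each discharge with the patient's next admission.
        for prev, nxt in zip(stays, stays[1:]):
            discharges += 1
            if (nxt["admit"] - prev["discharge"]).days <= 28:
                readmits += 1
        discharges += 1  # last discharge has no observed follow-up admission
    return readmits / discharges
```

A dashboard of the kind reviewed above would typically track such indicators over time and benchmark them against peer groups; real implementations would also need to handle transfers, deaths, and planned readmissions, which this sketch ignores.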