Objectives
Improving the usability of electronic health records (EHRs) continues to be a focus of clinicians, vendors, researchers, and regulatory bodies. To understand the impact of usability redesign of an existing, site-configurable feature, we evaluated the user interface (UI) used to screen for depression, alcohol and drug misuse, fall risk, and the existence of advance directive information in ambulatory settings.

Methods
As part of a quality improvement project, the existing UI was redesigned based on a heuristic analysis. Several usability defects were corrected through an iterative, user-centered design process. Summative usability testing was performed as part of the product development and implementation cycle. Clinical quality measures reflecting rolling 12-month screening rates were examined for 8 months before and 9 months after implementation of the redesigned UI.

Results
Summative usability testing demonstrated improvements in task time, error rates, and System Usability Scale (SUS) scores. Interrupted time series analysis demonstrated significant improvements in all screening rates after implementation of the redesigned UI compared with the original implementation.

Conclusion
User-centered redesign of an existing site-specific UI may lead to significant improvements in measures of usability and quality of patient care.
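The System Usability Scale reported above is scored with a fixed 0-100 formula, so a short worked example may help readers unfamiliar with it. Below is a minimal sketch of standard SUS scoring, assuming the usual 10-item instrument with 1-5 responses; the response set is invented for illustration and is not data from the study.

```python
# Worked sketch of System Usability Scale (SUS) scoring for one respondent.
# Odd-numbered items contribute (score - 1), even-numbered items (5 - score);
# the sum is multiplied by 2.5 to map onto a 0-100 scale.

def sus_score(responses: list[int]) -> float:
    """Compute the 0-100 SUS score from ten 1-5 item responses."""
    assert len(responses) == 10
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # index 0 is item 1 (odd-numbered)
        for i, r in enumerate(responses)
    )
    return total * 2.5

# Hypothetical respondent: agrees with positive items, disagrees with negative ones.
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # 85.0
```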
Background
Provider prescribing practices contribute to an excess of opioid-related deaths in the United States. Clinical guidelines exist to help providers improve prescribing practices and promote patient safety. Clinical decision support systems (CDSS) may promote adherence to these guidelines and improve prescribing practices. The aim of this project was to improve opioid guideline adherence, prescribing practices, and rates of opioid-related encounters through the implementation of an opioid CDSS.

Methods
A vendor-developed, provider-targeted CDSS package was implemented in a multi-location academic health center. An interrupted time series analysis was performed, comparing 30-week pre- and post-implementation periods. Outcomes were derived from vendor-supplied key performance indicators and directly from the electronic health record (EHR) database. Opioid-prescribing outcomes included the count of opioid prescriptions, morphine milligram equivalents per prescription, counts of opioids prescribed with concurrent benzodiazepines, and counts of short-acting opioids prescribed to opioid-naïve patients. Encounter outcomes included rates of encounters for opioid abuse and dependence and rates of encounters for opioid poisoning and overdose. Guideline adherence outcomes included rates of naloxone provision and documentation of opioid treatment agreements.

Results
The opioid CDSS generated an average of 1,637 alerts per week. Rates of naloxone provision and opioid treatment agreement documentation improved after CDSS implementation. Vendor-supplied prescribing outcomes were consistent with those derived directly from the EHR, but all prescribing and encounter outcomes were unchanged.

Conclusion
A vendor-developed, provider-targeted opioid CDSS did not improve opioid-prescribing practices or rates of opioid-related encounters, although it did improve some measures of provider adherence to opioid-prescribing guidelines. Further work is needed to determine the optimal configuration of opioid CDSS so that prescribing patterns are appropriately modified and encounter outcomes improve.
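For readers unfamiliar with interrupted time series analysis, the sketch below shows the segmented regression form commonly used for such designs, assuming weekly prescription counts and 30-week pre/post windows like those in the study. The data, variable names, and model specification are illustrative assumptions, not the authors' actual analysis.

```python
# Minimal sketch of a segmented (interrupted time series) regression:
# outcome ~ baseline trend + immediate level change + change in trend.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
weeks = np.arange(1, 61)                 # 30 weeks pre + 30 weeks post
post = (weeks > 30).astype(int)          # 1 after hypothetical CDSS go-live
# Simulated weekly opioid prescription counts with a small post-implementation drop.
rx_count = 200 - 0.3 * weeks - 5 * post + rng.normal(0, 4, 60)

df = pd.DataFrame({
    "rx_count": rx_count,
    "week": weeks,                                # overall secular trend
    "post": post,                                 # level change at go-live
    "week_after": np.clip(weeks - 30, 0, None),   # slope change after go-live
})
fit = smf.ols("rx_count ~ week + post + week_after", data=df).fit()
print(fit.params)  # coefficients on post and week_after capture the intervention effect
```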
Objectives
Poor electronic health record (EHR) usability is associated with patient safety concerns, user dissatisfaction, and provider burnout. EHR certification requires vendors to perform user testing, but there is no such requirement for site-specific implementations. Health care organizations customize their EHR implementations, potentially introducing usability problems. Site-specific usability evaluations may help identify these concerns, and “discount” usability methods give health systems a means of doing so even without dedicated usability specialists. This report characterizes a site-specific discount user testing program launched at an academic medical center. We describe lessons learned and highlight three of the EHR features in detail to demonstrate the impact of testing on implementation decisions and on users.

Methods
Thirteen new EHR features that had already undergone heuristic evaluation and iterative design were evaluated over the course of three user test events, each with five to six users. Participants used the think-aloud technique. Measures of user efficiency, effectiveness, and satisfaction were collected. Usability concerns were characterized by the type of usability heuristic violated and by correctability.

Results
Usability concerns occurred at a rate of 2.5 per feature tested. Seventy percent of the usability concerns were deemed correctable prior to implementation. The first highlighted feature was moved to production despite low Single Ease Question (SEQ) scores, which may have predicted its subsequent withdrawal from production based on postimplementation feedback. Another feature was rebuilt based on usability findings, and a new version was retested and moved to production. A third feature illustrates an easily correctable usability concern identified in user testing. Quantitative usability metrics generally reinforced qualitative findings.

Conclusion
Simplified user testing with a limited number of participants identifies correctable usability concerns, even after heuristic evaluation. Our discount approach to site-specific usability testing has a role in implementations and may improve the usability of the EHR for the end user.
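As a hypothetical illustration of how low Single Ease Question scores might flag a feature before go-live, the sketch below averages per-feature SEQ responses against a chosen cutoff. The feature names, scores, and 5.0 threshold are invented for illustration, not taken from the report.

```python
# Sketch: flag features whose mean Single Ease Question (SEQ) score falls
# below a chosen threshold, assuming the usual 1-7 SEQ scale.
from statistics import mean

seq_scores = {
    "feature_a": [3, 4, 2, 4, 3],   # five users per test event (hypothetical)
    "feature_b": [6, 7, 6, 5, 6],
}
THRESHOLD = 5.0  # illustrative cutoff, not a published standard

for feature, scores in seq_scores.items():
    m = mean(scores)
    flag = "review before go-live" if m < THRESHOLD else "ok"
    print(f"{feature}: mean SEQ {m:.1f} ({flag})")
```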
Introduction
Unnecessary and inappropriate laboratory testing contributes to increased health care costs, longer lengths of stay, and greater odds of blood product transfusion. The Choosing Wisely campaign recommends judicious use of laboratory blood testing to combat iatrogenic anemia. Reducing the number of duplicate test orders may help address these issues. We evaluated duplicate order alert thresholds in our electronic health record for 10 common laboratory tests at an academic medical center.

Methods
In January 2019, alert intervals for 10 common inpatient laboratory tests (thyroid stimulating hormone, complete blood count [CBC], hemoglobin A1c, troponin, lactic acid, hemoglobin and hematocrit, urinalysis, vitamin D, urine beta-hCG, and triglycerides) were adjusted to evidence-based, disease-specific thresholds. If a test was ordered within a timeframe shorter than its threshold, an alert interrupted the provider’s workflow; the provider could override the alert based on clinical judgment. This was a change from the previous settings, which triggered an alert for any test reordered within 8 hours. Postintervention duplicate order alerts were compared with baseline rates and adjusted for the number of inpatient discharges.

Results
In total, 914 orders were cancelled in 1 month as a result of the tailored duplicate order alerts, versus a baseline mean of 710 (95% CI, 633-786) and a predicted 552 (95% CI, 475-628) when adjusted for the number of inpatient discharges. The majority of cancelled orders were for CBC (530 accepted alerts). Overall, this reduction in unnecessary duplicate tests is equivalent to 3,092 mL of blood not collected from patients per month.

Conclusion
Tailoring duplicate order alert interval thresholds to evidence-based criteria helps reduce unnecessary testing, reduces costs, and may play an important role in reducing hospital-acquired anemia.
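The duplicate-order logic described in the Methods can be sketched as a simple interval lookup: alert when the same test was last ordered within its test-specific threshold. The intervals, test names, and function below are illustrative placeholders, not the evidence-based thresholds chosen in the study.

```python
# Sketch of a duplicate-order check: interrupt the workflow when a repeat
# order for the same test falls inside its configured alert interval.
from datetime import datetime, timedelta

ALERT_INTERVALS = {
    "TSH": timedelta(weeks=6),     # hypothetical interval
    "HbA1c": timedelta(days=90),   # hypothetical interval
    "CBC": timedelta(hours=24),    # hypothetical interval
}

def should_alert(test: str, last_ordered: datetime, now: datetime) -> bool:
    """Return True if the repeat order falls inside the test's alert interval."""
    interval = ALERT_INTERVALS.get(test)
    return interval is not None and (now - last_ordered) < interval

now = datetime(2019, 1, 15, 9, 0)
print(should_alert("CBC", datetime(2019, 1, 15, 1, 0), now))  # True: within 24 h
print(should_alert("HbA1c", datetime(2018, 9, 1), now))       # False: > 90 days ago
```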