Background As clinical users increasingly rely on dashboard reporting systems to monitor and track patient panels, developers must ensure that the information displays they produce are accurate and intuitive. When evaluating the usability of a clinical dashboard among potential end users, developers often rely on methods such as questionnaires rather than more time-intensive strategies that incorporate direct observation. Objectives Prior to release of the potentially inappropriate medication (PIM) clinical dashboard, designed to facilitate completion of a quality improvement project by clinician scholars enrolled in the Veterans Affairs (VA) workforce development Geriatric Scholars Program (GSP), we evaluated the usability of the system. This article describes the process of usability testing a dashboard reporting system with clinicians using direct observation and think-aloud moderating techniques. Methods We developed a structured interview protocol that combines virtual observation, think-aloud moderating techniques, and retrospective questioning of the overall user experience, including use of the System Usability Scale (SUS). Thematic analysis was used to analyze field notes from the interviews of three GSP alumni. Results Our structured approach to usability testing identified specific functional problems with the dashboard reporting system that were missed by results from the SUS. Usability testing led to overall improvements in the intuitive use of the system, increased data transparency, and clarification of the dashboard's purpose. Conclusion Relying solely on questionnaires and surveys in the end stages of dashboard development can mask functional problems that will impede proper usage and lead to misinterpretation of results.
A structured approach to usability testing during the development phase is an important tool for developers of clinician-friendly systems that display easily digested information and track outcomes for quality improvement.
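The SUS referenced above is scored with a fixed formula: each of the ten 1–5 Likert items contributes its distance from the worst response (odd items score response − 1, even items score 5 − response), and the 0–40 total is scaled to 0–100. A minimal sketch of that scoring rule (the function name is ours, not from the study):

```python
def sus_score(responses):
    """Compute a 0-100 SUS score from ten 1-5 Likert responses.

    Odd-numbered items (1-indexed) contribute (response - 1);
    even-numbered items contribute (5 - response). The 0-40 sum
    of contributions is scaled by 2.5 to give a 0-100 score.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        if not 1 <= r <= 5:
            raise ValueError("responses must be on a 1-5 scale")
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Maximally positive responses (5 on odd items, 1 on even items) score 100.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```

A score of 68 is the conventional benchmark for average usability, which is why the abstracts below compare against it.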
Background Despite the prevalence of medical interpreting in the clinical environment, few medical professionals receive training in best practices for working with an interpreter. We designed and implemented an educational workshop on using interpreters as part of the cultural competency curriculum for second-year medical students (MSIIs) at the David Geffen School of Medicine at UCLA. The purpose of this study is two-fold: first, to evaluate the effectiveness of the workshop; and second, if deficiencies are found, to investigate whether those deficiencies affected the quality of the patient encounter when using an interpreter. Methods A total of 152 MSIIs completed the 3-hour workshop and, 8 weeks later, a 1-station objective structured clinical examination to assess skills. Descriptive statistics and independent-sample t-tests were used to assess workshop effectiveness. Results Based on a passing score of 70%, 39.4% of the class failed. Two skills proved particularly problematic: assuring confidentiality (missed by 50%) and positioning the interpreter (missed by 70%). While addressing confidentiality did not have a significant impact on standardized patient satisfaction, interpreter position did. Conclusion Instructing the interpreter to sit behind the patient helps sustain eye contact between clinician and patient, while assuring confidentiality is a tenet of quality clinical encounters. Teaching students and faculty to emphasize both is warranted to improve cross-language clinical encounters.
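The independent-sample t-test named in the Methods compares group means (for example, standardized-patient satisfaction with versus without correct interpreter positioning). A minimal sketch using Welch's unequal-variance form; all data values below are invented for illustration and are not taken from the study:

```python
from statistics import mean, variance
from math import sqrt

def welch_t(a, b):
    """Return Welch's t statistic and degrees of freedom for two samples."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    t = (mean(a) - mean(b)) / sqrt(va + vb)
    # Welch-Satterthwaite approximation of the degrees of freedom.
    df = (va + vb) ** 2 / (va**2 / (len(a) - 1) + vb**2 / (len(b) - 1))
    return t, df

# Hypothetical satisfaction ratings for the two groups.
positioned = [4.5, 4.2, 4.8, 4.6, 4.4]
not_positioned = [3.9, 4.0, 3.7, 4.1, 3.8]
t, df = welch_t(positioned, not_positioned)
```

The t statistic and degrees of freedom would then be compared against a t distribution to obtain the p-value; in practice a statistics package handles that final step.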
Caregivers play an important role in the in-home care of community-dwelling older adults living with Alzheimer's disease or related dementias (ADRD); however, many of these caregivers lack training in caring for this vulnerable population. In 2015, we developed and implemented an interactive, community-based, knowledge- and skills-based training program for In-Home Supportive Services (IHSS) caregivers. This report shares the results of a process evaluation of this training program as it evolved over three training sessions in Riverside County, California. Our iterative evaluation process reveals the unique needs of training and assessing a population of demographically diverse adult learners and provides guidance for those planning to implement similar training in underserved communities. Factors such as reliance on self-reported abilities, language readability level, and test anxiety may have confounded attempts to capture learner feedback and actual knowledge gains from our caregiver training program.
Background Involving clinician end users in the development process of clinical dashboards is important to ensure that user needs are adequately met prior to releasing the dashboard for use. The challenge with this approach is that clinician end users can undergo periodic turnover, meaning that the clinicians who played a role in the initial development process may not be the same individuals who use the dashboard in the future. Objectives Here, we summarize our Plan, Do, Study, Act (PDSA)-guided clinical dashboard development process for the VA Geriatric Scholars Program (GSP) and the value of continuous, iterative development. We summarize dashboard adaptations that resulted from two PDSA cycles of improvement for the potentially inappropriate medication dashboard (PIMD), one of many Geriatric Scholars clinical dashboards. We also present the evaluative performance of the PIMD. Methods Evaluation of the PIMD was performed using the System Usability Scale (SUS) and through review of user interaction logs. Routine end users who were Geriatric Scholars and had evidence of 5 or more dashboard views were invited to complete an electronic form containing the 10-item SUS. Results The proportion of Geriatric Scholars that utilized the PIMD increased with each iterative dashboard version produced as a byproduct of feedback (31.0% in 2017 to 60.2% in 2019). The overall usability of the PIMD among routine users was above average (SUS score: 75.2 [95% CI 70.5–79.8]) in comparison to the recommended standard of acceptability (SUS score: 68). Conclusion The solicitation of feedback during dashboard orientations led to iterative adaptations of the PIMD that broadened its intended use. The presented PDSA-guided process to clinical dashboard development for the VA GSP can serve as a valuable framework for development teams seeking to produce well-adopted and usable health information technology (IT) innovations.
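The mean-with-95%-CI summary reported for the SUS (75.2 [95% CI 70.5–79.8]) follows the usual mean ± critical value × standard error form. A minimal sketch with invented scores; with small samples the critical value should come from a t distribution rather than the 1.96 normal approximation used as the default here:

```python
from statistics import mean, stdev
from math import sqrt

def mean_ci(scores, crit=1.96):
    """Return (mean, lower, upper) for a mean +/- crit * standard-error CI."""
    m = mean(scores)
    se = stdev(scores) / sqrt(len(scores))  # standard error of the mean
    return m, m - crit * se, m + crit * se

# Hypothetical per-respondent SUS scores, not data from the study.
scores = [72.5, 80.0, 77.5, 65.0, 85.0, 70.0, 75.0, 82.5]
m, lo, hi = mean_ci(scores)
```

Comparing the lower bound against the acceptability benchmark of 68 is one simple way to judge whether usability is credibly above average.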