BACKGROUND: On July 1, 2018, the Veterans Health Administration (VA) National Center for Ethics in Health Care implemented the Life-Sustaining Treatment Decisions Initiative (LSTDI). Its goal is to identify, document, and honor LST decisions of seriously ill veterans. Providers document veterans' goals and decisions using a standardized LST template and order set. OBJECTIVE: Evaluate the first 7 months of LSTDI implementation and identify predictors of LST template completion. DESIGN: Retrospective observational study of clinical and administrative data. We identified all completed LST templates, defined as completion of four required template fields. Templates also include four non-required fields. Results were stratified by risk of hospitalization or death as estimated by the Care Assessment Need (CAN) score.
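The completion measure described above (all four required template fields filled, stratified by CAN score) could be operationalized as in the following minimal sketch. This is not the study's actual analysis code; the column names (e.g., "goals_of_care", "can_score") and the CAN-score bands are hypothetical placeholders.

```python
# Minimal sketch: flag completed LST templates and stratify completion rates
# by CAN-score risk band. Column names and bands are assumptions, not the study's.
import pandas as pd

REQUIRED_FIELDS = [
    "goals_of_care",
    "cpr_preference",
    "life_sustaining_treatments",
    "signed_by_provider",
]

def summarize_completion(templates: pd.DataFrame) -> pd.DataFrame:
    """Mark templates with all four required fields present and summarize
    completion rates within hypothetical CAN-score strata."""
    df = templates.copy()
    df["completed"] = df[REQUIRED_FIELDS].notna().all(axis=1)
    # Higher CAN score = higher estimated risk of hospitalization or death.
    df["can_stratum"] = pd.cut(
        df["can_score"],
        bins=[0, 50, 75, 90, 99],
        labels=["low", "moderate", "high", "very high"],
        include_lowest=True,
    )
    return (
        df.groupby("can_stratum", observed=True)["completed"]
        .mean()
        .rename("completion_rate")
        .reset_index()
    )
```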
Background: User-centered design (UCD) methods are well-established techniques for creating useful artifacts, but few studies illustrate their application to clinical feedback reports. When used as an implementation strategy, the content of feedback reports depends on a foundational audit process involving performance measures and data, but these important relationships have not been adequately described. Better guidance on UCD methods for designing feedback reports is needed. Our objective is to describe a feedback report design method for refining the content of prototype reports. Methods: We propose a three-step feedback report design method (refinement of measures, data, and display). The three steps follow dependencies such that refinement of measures can require changes to the data, which in turn may require changes to the display. We believe this method can be used effectively with a broad range of UCD techniques. Results: We illustrate the three-step method as used in the implementation of goals of care conversations in long-term care settings in the U.S. Veterans Health Administration. Using iterative usability testing, feedback report content evolved over cycles of the three steps. Following the steps of the proposed method through 12 iterations with 13 participants, we improved the usability of the feedback reports. Conclusions: UCD methods can improve feedback report content through an iterative process. When designing feedback reports, refining measures, data, and display may enable report designers to improve the user-centeredness of feedback reports.
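The cascade of dependencies in the three-step method (measures drive data, data drive display) can be sketched as a simple data structure and refinement step. The classes, field names, and findings dictionary below are illustrative assumptions, not the authors' artifacts.

```python
# Minimal sketch of the measures -> data -> display dependency described above.
from dataclasses import dataclass

@dataclass
class ReportPrototype:
    measures: list[str]          # performance measures shown in the report
    data_sources: list[str]      # data elements needed to compute the measures
    display_elements: list[str]  # charts/tables that present the data

def refine(prototype: ReportPrototype, findings: dict) -> ReportPrototype:
    """One iteration of the three-step refinement: changes cascade downstream."""
    if "measures" in findings:
        prototype.measures = findings["measures"]
        # A change in measures can require different data ...
        prototype.data_sources = findings.get("data_sources", prototype.data_sources)
    if "data_sources" in findings:
        prototype.data_sources = findings["data_sources"]
        # ... which in turn may require changes to the display.
        prototype.display_elements = findings.get("display_elements", prototype.display_elements)
    return prototype
```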
Objective: To evaluate the effectiveness of feedback reports, and of feedback reports + external facilitation, on completion of the life-sustaining treatment (LST) note template and durable medical orders. This quality improvement program supported the national roll-out of the Veterans Health Administration (VA) LST Decisions Initiative (LSTDI), which aims to ensure that seriously ill veterans have care goals and LST decisions elicited and documented. Data Sources: Primary data from national databases for VA nursing homes (called Community Living Centers [CLCs]) from 2018 to 2020. Study Design: In one project, we distributed monthly feedback reports summarizing LST template completion rates to 12 sites as the sole implementation strategy. In the second project, involving five sites, we distributed similar feedback reports and provided robust external facilitation, which included coaching, education, and learning collaboratives. For each project, principal component analyses matched intervention to comparison sites, and interrupted time series/segmented regression analyses evaluated the differences in LSTDI template completion rates between intervention and comparison sites. Data Collection Methods: Data were extracted from national databases and supplemented with interviews and surveys as part of a mixed-methods process evaluation. Principal Findings: LSTDI template completion rose from 0% to about 80% over the study period in both projects' intervention and comparison CLCs. There were small but statistically significant differences for feedback reports alone (comparison sites performed better; coefficient estimate 3.48, standard error 0.99 for the between-group difference in change in trend) and for feedback reports + external facilitation (intervention sites performed better; coefficient estimate -2.38, standard error 0.72). Conclusions: Feedback reports + external facilitation was associated with a small but statistically significant improvement in outcomes compared with comparison sites.
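The interrupted time series / segmented regression comparison described above can be sketched as a single segmented-regression model with group interactions. This is an assumed specification with hypothetical variable names, not the project's actual model.

```python
# Minimal sketch of a segmented regression for an interrupted time series
# comparing intervention vs. comparison CLCs. Variable names are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

def fit_segmented_model(df: pd.DataFrame):
    """Expected (hypothetical) columns: completion_rate, month (0..N),
    post (0/1 after implementation), months_post (0 before, 1..k after),
    intervention (0/1 group indicator)."""
    model = smf.ols(
        "completion_rate ~ month + post + months_post"
        " + intervention + intervention:month + intervention:post + intervention:months_post",
        data=df,
    )
    fit = model.fit()
    # The intervention:months_post coefficient corresponds to the between-group
    # difference in change in trend reported in the findings above.
    return fit
```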
Background Implementation researchers recognize the influential role of organizational factors and, thus, seek to assess these factors using quantitative measurement instruments. However, researchers are hindered by instruments that measure similar constructs but rely on different nomenclature and/or definitions. The Consolidated Framework for Implementation Research (CFIR) provides a taxonomy of constructs derived from prior frameworks and empirical studies of implementation-related constructs. The CFIR includes constructs based on the original Promoting Action on Research Implementation in Health Services (PARiHS) framework, which highlights the key roles of strength of evidence for a specific evidence-based intervention (EBI), favorability of organizational context for change, and capacities to facilitate implementation of the EBI. Although the CFIR is among the most frequently used implementation frameworks, it does not include quantitative measures. The Organizational Resource and Context Assessment (ORCA) is a quantitative measurement instrument that was developed based on PARiHS and assesses its three domains. Factors within these three domains are conceptually similar to constructs in the CFIR but do not match directly. The aim of this work was to map ORCA survey items to CFIR constructs to enable direct comparisons and syntheses of findings across studies using the CFIR and/or ORCA. Methods Two distinct, independent research teams each used rigorous constant comparative techniques, with deliberation and consensus, to map individual items from the ORCA to the five domains and 39 constructs of the CFIR. Results ORCA items were mapped primarily to three of the five CFIR domains: Inner Setting, Process, and Intervention Characteristics. The two research teams agreed on 88% of mappings at the higher domain level; at the lower construct level, their mappings aligned for 62.2% of the ORCA items. Conclusions Mapping results reveal that the ORCA focuses measurement prominently on Inner Setting, Process, and Intervention Characteristics. This mapping guide can help improve consistency in measurement and reporting, enabling more efficient comparison and synthesis of findings that use either the ORCA instrument or the CFIR framework. The guide helps advance mixed-methods implementation science by providing CFIR users with quantitative measures for selected constructs and enabling ORCA users to map their findings to CFIR constructs.
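The reported agreement statistics (88% at the domain level, 62.2% at the construct level) amount to simple percent agreement between the two teams' mappings, which could be computed as in the following sketch. The item IDs and mappings shown are placeholders, not the actual mapping guide.

```python
# Minimal sketch: percent agreement between two teams' ORCA-to-CFIR mappings.
# Items, domains, and mappings below are made-up placeholders.
def percent_agreement(team_a: dict[str, str], team_b: dict[str, str]) -> float:
    """Share of ORCA items mapped identically by both teams, at whatever level
    of granularity (domain or construct) the mapping values represent."""
    shared = set(team_a) & set(team_b)
    matches = sum(team_a[item] == team_b[item] for item in shared)
    return 100.0 * matches / len(shared) if shared else 0.0

# Example at the CFIR domain level with placeholder items.
team_a = {"ORCA_1": "Inner Setting", "ORCA_2": "Process", "ORCA_3": "Intervention Characteristics"}
team_b = {"ORCA_1": "Inner Setting", "ORCA_2": "Process", "ORCA_3": "Outer Setting"}
print(percent_agreement(team_a, team_b))  # ~66.7 with these placeholder mappings
```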
Background Process mapping is often used in quality improvement work to examine current processes and workflow and to identify areas in which to intervene to improve quality. Our objective in this paper is to describe process maps as a visual means of understanding modifiable behaviors and activities, in this case to ensure that goals of care conversations are part of admitting a veteran to a long-term care setting. Methods We completed site visits to 6 VA nursing homes and reviewed their current admission processes. We conducted interviews to document the behaviors and activities that occur when a veteran is referred to a long-term care setting, during admission, and during mandatory VA reassessments. We created visualizations of the data using process mapping approaches. Process maps were created for each site to document that VA nursing home's admission activities and were reviewed by the research team to identify consistencies across sites and potential opportunities for implementing goals of care conversations. Results We identified five consistent behaviors that take place when a veteran is referred to and admitted into long-term care: assessing, discussing, decision-making, documenting, and re-assessing. Conclusions Based on the process maps, it appears feasible that the LST note and order template could be completed alongside other routine assessment processes. However, this will require more robust multidisciplinary collaboration among both prescribing and non-prescribing health care providers. Completing the LST template during the current admission process would increase the likelihood that the template is completed in a timely manner, potentially alleviate the perceived time burden, and help with the provision of veteran-centered care.
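The five consistent behaviors identified above can be represented as a simple directed process map, as in the sketch below. The step names mirror the reported findings, but the transitions and the traversal helper are illustrative assumptions, not the sites' documented workflows.

```python
# Minimal sketch: the admission process as a directed map of the five behaviors.
ADMISSION_PROCESS = {
    "assessing": ["discussing"],
    "discussing": ["decision-making"],
    "decision-making": ["documenting"],   # e.g., completing the LST note and order template
    "documenting": ["re-assessing"],
    "re-assessing": ["assessing"],        # mandatory VA reassessments loop back
}

def walk(process: dict[str, list[str]], start: str, steps: int) -> list[str]:
    """Trace a path through the process map, following the first listed transition."""
    path = [start]
    for _ in range(steps):
        path.append(process[path[-1]][0])
    return path

print(" -> ".join(walk(ADMISSION_PROCESS, "assessing", 5)))
```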