Background
Implementation strategies have flourished in an effort to increase the integration of research evidence into clinical practice. Most strategies are complex, socially mediated processes. Many are complicated, expensive, and ultimately impractical to deliver in real-world settings. The field lacks methods to assess the extent to which strategies are usable and aligned with the needs and constraints of the individuals and contexts that will deliver or receive them. Drawn from the field of human-centered design, cognitive walkthroughs are an efficient assessment method with the potential to identify aspects of strategies that may inhibit their usability and, ultimately, their effectiveness. This article presents a novel walkthrough methodology for evaluating strategy usability, as well as an example application to a post-training consultation strategy designed to support school mental health clinicians in adopting measurement-based care.

Method
The Cognitive Walkthrough for Implementation Strategies (CWIS) is a pragmatic, mixed-methods approach for evaluating complex, socially mediated implementation strategies. CWIS includes six steps: (1) determine preconditions; (2) hierarchical task analysis; (3) task prioritization; (4) convert tasks to scenarios; (5) pragmatic group testing; and (6) usability issue identification, classification, and prioritization. A facilitator conducted two group testing sessions with clinician users (N = 10), guiding participants through 6 scenarios and 11 associated subtasks. Clinicians rated their anticipated likelihood of completing each subtask and provided qualitative justifications during group discussion. Following the walkthrough sessions, users completed an adapted quantitative assessment of strategy usability.

Results
Average anticipated success ratings indicated substantial variability across participants and subtasks. Usability ratings (scale 0–100) of the consultation protocol averaged 71.3 (SD = 10.6). Twenty-one usability problems were identified via qualitative content analysis with consensus coding and were classified by severity and problem type. High-severity problems included potential misalignment between consultation and clinical service timelines, as well as digressions during consultation processes.

Conclusions
CWIS quantitative usability ratings placed the consultation protocol at the low end of the "acceptable" range (based on norms from the unadapted scale). Collectively, the 21 resulting usability issues explained the quantitative usability data and provided specific direction for usability enhancements. The current study provides preliminary evidence for the utility of CWIS to assess strategy usability and generate a blueprint for redesign.
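The Results and Conclusions above report an average 0–100 usability rating (M = 71.3, SD = 10.6) interpreted against norms from the unadapted scale. As a minimal sketch of that kind of aggregation, the snippet below averages per-participant ratings and compares the mean to an assumed acceptability benchmark; the individual ratings and the cutoff of 70 (drawn from commonly cited System Usability Scale norms) are illustrative assumptions, not data or cut points from the study.

```python
from statistics import mean, stdev

# Hypothetical per-participant ratings on the 0-100 scale (illustrative only;
# the study reports M = 71.3, SD = 10.6 for N = 10 clinician users).
ratings = [58, 62, 66, 70, 71, 73, 75, 78, 80, 80]

# Assumed benchmark: ~70 is often cited as the low end of "acceptable" on the
# unadapted SUS; the adapted scale in the study may use different norms.
ACCEPTABLE_CUTOFF = 70.0

m, sd = mean(ratings), stdev(ratings)
verdict = "acceptable" if m >= ACCEPTABLE_CUTOFF else "below acceptable"
print(f"Mean = {m:.1f}, SD = {sd:.1f} -> {verdict} (cutoff {ACCEPTABLE_CUTOFF})")
```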
Background
Technology-enabled services (TESs), which integrate human service and digital components, are popular strategies for increasing the reach and impact of mental health interventions, but large-scale implementation of TESs has lagged behind their potential.

Objective
This study applied a mixed qualitative and quantitative approach to gather input from multiple key user groups (students and educators) and to understand the factors that support successful implementation (implementation determinants) and the implementation outcomes of a TES for universal screening, ongoing monitoring, and support for suicide risk management in the school setting.

Methods
A total of 111 students in grades 9 to 12 completed measures of implementation outcomes (acceptability, feasibility, and appropriateness) via an open-ended survey. A total of 9 school personnel (school-based mental health clinicians, nurses, and administrators) completed laboratory-based usability testing of a dashboard tracking the suicide risk of students, quantitative measures, and qualitative interviews to understand key implementation outcomes and determinants. School personnel were presented with a series of scenarios and common tasks focused on the basic features and functions of the dashboard. Directed content analysis based on the Consolidated Framework for Implementation Research was used to extract multilevel determinants (i.e., barriers or facilitators at the levels of the outer setting, inner setting, individuals, intervention, and implementation process) related to positive implementation outcomes of the TES.

Results
Overarching themes related to implementation determinants and outcomes suggest that both student and school personnel users view TESs for suicide prevention as moderately feasible and acceptable (based on the Acceptability of Intervention Measure and the Feasibility of Intervention Measure) and as needing usability improvements (based on the System Usability Scale). Qualitative results suggest that students and school personnel view passive data collection based on social media data as a relative advantage over the current system; however, the findings indicate that the TES and the school setting need to address issues of privacy, integration into existing workflows and communication patterns, and options for individualization in student-centered care.

Conclusions
Innovative suicide prevention strategies that rely on passive data collection in the school context are a promising and appealing idea. Usability testing identified key issues for revision to facilitate widespread implementation.
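The school personnel usability results above reference the System Usability Scale (SUS). For orientation, here is a minimal sketch of the standard SUS scoring procedure (ten items rated 1–5; odd-numbered items contribute rating − 1, even-numbered items contribute 5 − rating, and the summed contributions are multiplied by 2.5 to yield a 0–100 score). The function name and the example response set are hypothetical and are not study data.

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten 1-5 ratings.

    Standard scoring: odd-numbered (positively worded) items contribute
    (rating - 1); even-numbered (negatively worded) items contribute
    (5 - rating); the sum of contributions is multiplied by 2.5.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten responses on a 1-5 scale")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # 0-based index: even index = odd-numbered item
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5

# Hypothetical response set from one respondent (not study data)
print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 3, 2]))  # -> 77.5
```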