Background: Efforts to engage patients as partners in health research have grown, and with them the need for feedback and evaluation. In this pilot evaluation study, we aimed to 1) evaluate patient engagement in health research projects in Newfoundland and Labrador, Canada, and 2) learn more about how best to monitor and evaluate patient engagement. This paper presents the results of our participatory evaluation study and the lessons learned. The evaluation of the projects was driven by questions patients wanted answered.
Methods: We conducted a formative evaluation of patient engagement in health research projects. The projects spanned a variety of topics, target groups, research designs and methods of patient engagement. Participants included principal investigators (n = 6) and their patient partners (n = 14), as well as graduate students (n = 13) working on their own research projects. Participants completed an online survey with closed and open-ended questions about their patient engagement efforts, experiences and preliminary outcomes. Patients were involved as co-investigators throughout the evaluation study. We used qualitative methods to evaluate our participatory process.
Results: Most patients and researchers felt prepared and worked together in various phases of the research process. Both groups felt that the insights and comments of patients influenced research decisions, and both believed that patient engagement improved the quality and uptake of research. Students felt less prepared and were less satisfied with their patient engagement experience than researchers and their patient partners. Involving patient co-investigators in this evaluation resulted in shared learning, transparency, validation of findings and increased applicability. The main challenges were selecting evaluation questions relevant to all stakeholders and adapting evaluation tools to local needs.
With opioid use at crisis levels, it is imperative to support youth with opioid use disorder (OUD) in taking medication and accessing behavioral services over long periods. This article presents a conceptual framework for telehealth strategies that can be adopted to increase family involvement across a four-stage continuum of youth OUD treatment and recovery: Treatment Preparation, Treatment Initiation, Treatment Stabilization, and OUD Recovery. It first identifies provider-delivered tele-interventions that can enhance OUD services in each of the four stages, including family outreach, family engagement, family-focused intervention, and family-focused recovery maintenance. It then introduces several types of direct-to-family tele-supports that can be used to supplement provider-delivered interventions. These include both synchronous tele-supports (remote interactions that occur in real time), such as helplines, peer-to-peer coaching, and online support groups, and asynchronous tele-supports (communications that occur without participants being simultaneously present), such as automated text messaging, self-directed internet-based courses, and digital web support.
A foundational strategy for promoting implementation of evidence-based interventions (EBIs) is providing EBI training to therapists. This study tested an online training system in which therapists practiced observational coding of mock video vignettes demonstrating family therapy techniques for adolescent behavior problems. The study compared therapists' ratings to gold-standard scores to measure therapist reliability (consistency across vignettes) and accuracy (approximation to gold scores); tested whether reliability and accuracy improved during training; and tested therapist-level predictors of overall accuracy and of change in accuracy over time. Participants were 48 therapists working in nine community behavioral health clinics. The 32-exercise training course provided online instruction (about 15 min/week) in 13 core family therapy techniques representing three modules: Family Engagement, Relational Orientation, and Interactional Change. Therapist reliability in rating technique presence (i.e., technique recognition) remained moderate across training; reliability in rating the extensiveness of technique delivery (i.e., technique judgment) improved sharply over time, from poor to good. Although therapists on average overestimated extensiveness for almost every technique, their tendency to give low-accuracy scores decreased. Therapist accuracy improved significantly over time only for Interactional Change techniques. Baseline digital literacy and submission of self-report checklists on use of the techniques in therapists' own sessions predicted coding accuracy. Training therapists to be more reliable and accurate coders of EBI techniques can potentially yield benefits in increased EBI self-report acumen and EBI use in daily practice. However, training effects may need to exceed those reported here to have a meaningful impact on EBI implementation. Trial Registration: The parent clinical trial is registered at www.clinicaltrials.gov, ID: NCT03342872 (registration date: 11.10.17).
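To make the abstract's distinction between reliability (consistency across vignettes) and accuracy (approximation to gold scores) concrete, the short sketch below shows one way such metrics could be computed for a single therapist. The data, the mean-absolute-deviation measure of accuracy, and the correlation-based proxy for reliability are illustrative assumptions only; the study itself does not report its statistics in this abstract and may have used other indices (for example, intraclass correlations).

import numpy as np

# Hypothetical ratings (not study data): each position is one vignette,
# scored for extensiveness of technique delivery on the same scale by the
# gold-standard coders and by one trainee therapist.
gold      = np.array([3, 5, 2, 6, 4, 1, 7, 3])
therapist = np.array([4, 6, 3, 6, 5, 2, 7, 4])

# "Accuracy" here: how closely the therapist's ratings approximate the
# gold-standard scores, summarized as mean absolute deviation (lower is better).
accuracy_mad = np.mean(np.abs(therapist - gold))

# "Reliability" here: consistency of the therapist's ratings with the gold
# standard across vignettes, summarized as a Pearson correlation (higher is better).
reliability_r = np.corrcoef(therapist, gold)[0, 1]

print(f"Accuracy (mean absolute deviation): {accuracy_mad:.2f}")
print(f"Reliability (correlation with gold): {reliability_r:.2f}")

In this toy example the therapist tracks the gold scores closely (high correlation, so good reliability) while consistently rating one point too high (non-zero deviation, so imperfect accuracy), which mirrors the abstract's finding that therapists tended to overestimate extensiveness even as their consistency improved.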