Background
This systematic review assessed the effectiveness of capacity-building interventions relevant to public health practice. The aim is to inform and improve capacity-building interventions.

Methods
Four search strategies were used: 1) electronic database searching; 2) reference lists of included papers; 3) key informant consultation; and 4) grey literature searching. Inclusion criteria (e.g., published in English) and exclusion criteria (e.g., non-English language papers or papers published earlier than 2005) are outlined. Included papers focused on capacity building, learning plans, or professional development plans within public health and related settings, such as non-governmental, government, or community-based organizations relating to public health or healthcare. Outcomes of interest included changes in knowledge, skill, or confidence (self-efficacy); changes in practice (application or intent); and perceived support or supportive environments, with outcomes reported at the individual, organizational, or systems level(s). Quality assessment of all included papers was completed.

Results
Fourteen papers were included in this review. These papers reported on six intervention types: 1) internet-based instruction; 2) training and workshops; 3) technical assistance; 4) education using self-directed learning; 5) communities of practice; and 6) multi-strategy interventions. The available literature showed improvements in one or more capacity-building outcomes of interest, mainly at the individual level. The literature was moderate in quality and showed a range of methodological issues.

Conclusions
There is evidence to inform capacity-building programming and how interventions can be selected to optimize impact. Organizations should carefully consider methods for analysis of the capacity-building interventions offered; specifically, through which mechanisms, to whom, and for which purpose.
Capacity-building interventions can enhance knowledge, skill, self-efficacy (including confidence), changes in practice or policies, behaviour change, application, and system-level capacity. However, in applying the available evidence, organizations should consider the outcomes of highest priority and select the intervention(s) effective for the outcome(s) of interest. Examples are given for selecting intervention(s) to match priorities and context, recognizing that effectiveness evidence is only one consideration in decision-making. Future evaluations should extend beyond the individual level, assess outcomes at organizational and systems levels, include objective measures of effect, assess baseline conditions, and evaluate the features most critical to the success of interventions.

Electronic supplementary material
The online version of this article (10.1186/s12889-018-5591-6) contains supplementary material, which is available to authorized users.
Background
There is increasing awareness that, regardless of the proven value of clinical interventions, effective strategies to implement such interventions into clinical practice are necessary to ensure that patients receive the benefits. However, there is often confusion between what is the clinical intervention and what is the implementation intervention. This may be caused by a lack of conceptual clarity between ‘intervention’ and ‘implementation’, and at other times by ambiguity in application. We suggest that both the scientific and clinical communities would benefit from greater clarity; therefore, in this paper, we address the concepts of intervention and implementation, primarily in the form of clinical interventions and implementation interventions, and explore the grey area in between.

Discussion
To begin, we consider the similarities, differences, and potential greyness between clinical interventions and implementation interventions through an overview of concepts. This is illustrated with reference to two examples of clinical intervention and implementation intervention studies, including the potential ambiguity in between. We then discuss strategies to explore the hybridity of clinical-implementation intervention studies, including the role of theories, frameworks, models, and reporting guidelines that can be applied to help clarify the clinical and implementation intervention, respectively.

Conclusion
Semantics provide opportunities for improved precision in depicting what is ‘intervention’ and what is ‘implementation’ in health care research. Further, attention to study design, the use of theory, and adoption of reporting guidelines can assist in distinguishing between the clinical intervention and the implementation intervention. However, certain aspects may remain unclear in analyses of hybrid studies of clinical and implementation interventions. Recognizing this potential greyness can inform further discourse.
Background
While there is an expectation to demonstrate evidence-informed public health, there is an ongoing need for capacity development. The purpose of this paper is to describe a tailored knowledge translation intervention implemented by knowledge brokers (KBs), and to reflect on the factors that facilitated or hindered its implementation.

Methods
The 22-month knowledge translation intervention, implemented by two KBs, sought to facilitate evidence-informed public health decision-making. Data on outcomes were collected using a knowledge, skills, and behavioural assessment survey. In addition, the KBs maintained reflective journals noting which activities appeared successful or not, as well as factors related to the individual or the organisation that facilitated or hindered evidence-informed decision-making.

Results
Tailoring the knowledge translation intervention to address the needs, preferences, and structure of each organisation resulted in three unique interventions being implemented. A consistent finding across organisations was that each site needed to determine where evidence-informed decision-making ‘fit’ within pre-existing organisational processes. Components of the intervention consistent across the three organisations included one-to-one mentoring of teams through rapid evidence reviews, large group workshops, and regular meetings with senior management. Components that varied included how often the KB was physically onsite, the amount of time staff spent with the KB, and the proportion of time spent one-to-one with a KB versus in workshops. Key facilitating factors for implementation included strong leadership, the influential power of champions, supportive infrastructure, committed resources, and staff enthusiasm.
Conclusions
The results of this study illustrate the importance of working collaboratively with organisations to tailor knowledge translation interventions to best meet their unique needs, preferences, organisational structures, and contexts. Organisational factors such as leadership, champions, and supportive infrastructure play a key role in determining the impact of knowledge translation interventions. Future studies should explore how these factors can be fostered and/or developed within organisations. While KBs implemented the knowledge translation intervention in this study, more research is needed to understand the impact of all change agent roles, including KBs, as well as how these roles can be maintained in the long term if proven effective.