2019
DOI: 10.1186/s13012-019-0897-z

Designing high-quality implementation research: development, application, feasibility and preliminary evaluation of the implementation science research development (ImpRes) tool and guide

Abstract: Background: Designing implementation research can be a complex and daunting task, especially for applied health researchers who have not received specialist training in implementation science. We developed the Implementation Science Research Development (ImpRes) tool and supplementary guide to address this challenge and provide researchers with a systematic approach to designing implementation research. Methods: A multi-method and multi-stage approach was employed. An int…

Cited by 82 publications (86 citation statements)
References 48 publications
“…Quantitative data may include scores on scales validated to examine concepts such as organizational readiness to change and Organizational Change Manager scores [25]. While CFIR is the most widely adopted, there are other tools such as the Implementation Science Research Development (ImpRes) offered to help design high-quality implementation research [26]…”
Section: Outcomes Of Interest In Implementation Science (mentioning)
confidence: 99%
“…Implementation research aims to close the research-to-practice gap, support scale-up of evidence-based interventions and reduce research waste [1,2]. The field of implementation science has gained recognition over the last 10 years, with advances in effectiveness-implementation hybrid designs [3], frameworks that inform the determinants, processes and evaluation of implementation efforts [4-7], reporting guidance [8] and educational resources [9]. An essential component of these recent developments, and of the field as a whole, is the use of valid and reliable implementation outcome instruments.…”
Section: Introduction (mentioning)
confidence: 99%
“…Using the Theory Comparison and Selection Tool (T-CaST) and checklist [51], the AACTT framework is designed to be usable (includes relevant domains, has been developed so that key stakeholders can use it, we provide steps for its application and methods for promoting its application across a range of possible studies and an explanation for how the domains influence each other), testable/valid (can form the basis for testable hypotheses, includes face-valid explanations and has been used in empirical studies), applicable (focuses on a key implementation outcome, can be applied across a range of methods and across a range of analytical levels, populations and conditions and is generalisable across disciplines) and is likely to be acceptable (to key stakeholders, and is the historical evolution of a framework rooted in a particular discipline). Thus, in principle it fulfils all the criteria for use of a framework by implementation science researchers and practitioners, though its actual usability, testability, applicability and acceptability will ultimately be determined through application of the tool across a range of types of implementation research [52].…”
Section: Discussion (mentioning)
confidence: 99%