Evidence-based policy is a dominant theme in contemporary public services, but the practical realities and challenges involved in using evidence in policy-making are formidable. Part of the problem is one of complexity. In health services and other public services, we are dealing with complex social interventions which act on complex social systems: things like league tables, performance measures, regulation and inspection, or funding reforms. These are not 'magic bullets' which will always hit their target, but programmes whose effects depend crucially on context and implementation. Traditional methods of review focus on measuring and reporting programme effectiveness, often find that the evidence is mixed or conflicting, and provide little or no clue as to why an intervention worked or did not work when applied in different contexts or circumstances, deployed by different stakeholders, or used for different purposes. This paper offers a model of research synthesis designed to work with complex social interventions or programmes, based on the emerging 'realist' approach to evaluation. It provides an explanatory analysis aimed at discerning what works for whom, in what circumstances, in what respects and how. The first step is to make explicit the programme theory (or theories): the underlying assumptions about how an intervention is meant to work and what impacts it is expected to have. The second step is to search for empirical evidence to populate this theoretical framework, supporting, contradicting or modifying the programme theories as the review proceeds. The results of the review combine theoretical understanding and empirical evidence, and focus on explaining the relationship between the context in which the intervention is applied, the mechanisms by which it works and the outcomes which are produced. The aim is to enable decision-makers to reach a deeper understanding of the intervention and how it can be made to work most effectively. Realist review does not provide simple answers to complex questions. It will not tell policy-makers or managers whether something works or not, but it will provide the policy and practice community with the kind of rich, detailed and highly practical understanding of complex social interventions that is likely to be of much more use to them when planning and implementing programmes at a national, regional or local level.
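For readers less familiar with realist terminology, the explanatory logic sketched above is conventionally summarised by the realist evaluation heuristic associated with Pawson and Tilley (standard notation in the realist literature, not a quotation from this abstract):

\[ \text{Context} + \text{Mechanism} = \text{Outcome} \]

In other words, a realist review accumulates context-mechanism-outcome (CMO) configurations that explain how, for whom and in what circumstances a programme works, rather than estimating a single pooled effect.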
The argument put forward in this paper is that successful implementation of research into practice is a function of the interplay of three core elements: the level and nature of the evidence, the context or environment into which the research is to be placed, and the method or way in which the process is facilitated. It also proposes that, because current research is inconclusive as to which of these elements is most important for successful implementation, all three should have equal standing. This is contrary to the often implicit assumptions currently being generated within the clinical effectiveness agenda, where the level and rigour of the evidence seems to be the most important factor for consideration. The paper offers a conceptual framework that addresses this imbalance, showing how it might work both in clarifying some of the theoretical positions and as a checklist for staff to assess what they need to do to implement research into practice successfully.
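This three-element interplay is commonly rendered in the PARIHS literature as a simple functional shorthand (the expression below is that conventional shorthand, not a direct quotation from this abstract):

\[ \text{SI} = f(E, C, F) \]

where SI denotes successful implementation, E the nature and level of the evidence, C the context or environment into which the research is placed, and F the way the implementation process is facilitated.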
Background: The Promoting Action on Research Implementation in Health Services (PARIHS) framework was first published in 1998. Since then, work has been ongoing to further develop, refine and test it. Widely used as an organising or conceptual framework to help both explain and predict why the implementation of evidence into practice is or is not successful, PARIHS was one of the first frameworks to make explicit the multi-dimensional and complex nature of implementation, as well as highlighting the central importance of context. Several critiques of the framework have also pointed out its limitations and suggested areas for improvement.
Discussion: Building on the published critiques and a number of empirical studies, this paper introduces a revised version of the framework, called the integrated or i-PARIHS framework. The theoretical antecedents of the framework are described, and the revised and new elements are outlined: notably, the revision of how evidence is described, how individuals and teams are incorporated, and how context is further delineated. We describe how the framework can be operationalised and draw on case study data to demonstrate preliminary testing of the face and content validity of the revised framework.
Summary: This paper is presented for deliberation and discussion within the implementation science community. Responding to a series of critiques and helpful feedback on the utility of the original PARIHS framework, we seek feedback on the proposed improvements. We believe that the i-PARIHS framework creates a more integrated approach to understanding the theoretical complexity from which implementation science draws its propositions and working hypotheses; that the new framework is more coherent and comprehensive while maintaining its intuitive appeal; and that the models of facilitation described enable its more effective operationalisation.
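In the revised i-PARIHS framework, facilitation is repositioned as the active ingredient that works on the other constructs. This is often expressed with the following shorthand (readers should check the exact notation against the source paper):

\[ \text{SI} = \mathit{Fac}^{\,n}(I + R + C) \]

where SI is successful implementation, Fac is facilitation, I is the innovation (the broadened description of evidence), R is the recipients (the individuals and teams involved), and C is the further delineated context.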
Background: The PARiHS framework (Promoting Action on Research Implementation in Health Services) has proved to be a useful practical and conceptual heuristic for many researchers and practitioners in framing their research or knowledge translation endeavours. However, as a conceptual framework it remains untested, and its contribution to the overall development and testing of theory in the field of implementation science is therefore largely unquantified.