Evidence-based policy is a dominant theme in contemporary public services, but the practical realities and challenges involved in using evidence in policy-making are formidable. Part of the problem is one of complexity. In health services and other public services, we are dealing with complex social interventions which act on complex social systems: things like league tables, performance measures, regulation and inspection, or funding reforms. These are not 'magic bullets' which will always hit their target, but programmes whose effects depend crucially on context and implementation. Traditional methods of review focus on measuring and reporting programme effectiveness; they often find that the evidence is mixed or conflicting, and they provide little or no clue as to why an intervention worked or did not work when applied in different contexts or circumstances, deployed by different stakeholders, or used for different purposes. This paper offers a model of research synthesis which is designed to work with complex social interventions or programmes, and which is based on the emerging 'realist' approach to evaluation. It provides an explanatory analysis aimed at discerning what works for whom, in what circumstances, in what respects and how. The first step is to make explicit the programme theory (or theories): the underlying assumptions about how an intervention is meant to work and what impacts it is expected to have. The review then looks for empirical evidence to populate this theoretical framework, supporting, contradicting or modifying the programme theories as it goes. The results of the review combine theoretical understanding and empirical evidence, and focus on explaining the relationship between the context in which the intervention is applied, the mechanisms by which it works and the outcomes which are produced. The aim is to enable decision-makers to reach a deeper understanding of the intervention and how it can be made to work most effectively. Realist review does not provide simple answers to complex questions. It will not tell policy-makers or managers whether something works or not, but it will provide the policy and practice community with the kind of rich, detailed and highly practical understanding of complex social interventions that is likely to be of much more use to them when planning and implementing programmes at a national, regional or local level.
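The context-mechanism-outcome logic at the heart of this approach can be made concrete with a small data-structure sketch. The Python below is purely illustrative and is not drawn from the paper; the class names, fields and example study labels are hypothetical assumptions about how a review team might record programme theories and the evidence that supports or contradicts them.

# Hypothetical sketch (not part of the original paper): one way a review team
# might record evidence extracted during a realist review, organised around
# context-mechanism-outcome (CMO) configurations.
from dataclasses import dataclass, field
from typing import List

@dataclass
class CMOConfiguration:
    """A single context-mechanism-outcome configuration for a programme theory."""
    context: str      # circumstances in which the intervention is applied
    mechanism: str    # how the intervention is thought to generate change
    outcome: str      # the pattern of outcomes observed in that context
    supporting_sources: List[str] = field(default_factory=list)     # evidence supporting the theory
    contradicting_sources: List[str] = field(default_factory=list)  # evidence challenging or refining it

@dataclass
class ProgrammeTheory:
    """An explicit statement of how an intervention is meant to work."""
    statement: str
    cmo_configurations: List[CMOConfiguration] = field(default_factory=list)

# Illustrative example only: a candidate theory about performance league tables.
theory = ProgrammeTheory(
    statement="Publishing league tables motivates providers to improve quality.",
    cmo_configurations=[
        CMOConfiguration(
            context="Providers compete for patients and funding",
            mechanism="Reputational pressure prompts internal quality initiatives",
            outcome="Measured indicators improve; unmeasured aspects may be neglected",
            supporting_sources=["Study A"],
            contradicting_sources=["Study B"],
        )
    ],
)
print(f"{theory.statement} ({len(theory.cmo_configurations)} CMO configuration(s))")

Organising extracted evidence this way keeps each programme theory explicitly linked to the contexts, mechanisms and outcomes reported in the primary studies, which is the relationship the review sets out to explain.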
Background: There is growing interest in realist synthesis as an alternative systematic review method. This approach offers the potential to expand the knowledge base in policy-relevant areas, for example by explaining the success, failure or mixed fortunes of complex interventions. No publication standards previously existed for reporting realist syntheses. This standard was developed as part of the RAMESES (Realist And MEta-narrative Evidence Syntheses: Evolving Standards) project, whose aim is to produce preliminary publication standards for realist systematic reviews.
Methods: We (a) collated and summarized the existing literature on the principles of good practice in realist syntheses; (b) considered the extent to which these principles had been followed by published syntheses, thereby identifying how rigor may be lost and how existing methods could be improved; (c) used a three-round online Delphi method with an interdisciplinary panel of national and international experts in evidence synthesis, realist research, policy and/or publishing to produce and iteratively refine a draft set of methodological steps and publication standards; (d) provided real-time support to ongoing realist syntheses and ran the open-access RAMESES online discussion list so as to capture problems and questions as they arose; and (e) synthesized expert input, evidence syntheses and real-time problem analysis into a definitive set of standards.
Results: We identified 35 published realist syntheses, provided real-time support to 9 ongoing syntheses and captured questions raised in the RAMESES discussion list. Through analysis and discussion within the project team, we summarized the published literature and the common questions and challenges into briefing materials for the Delphi panel, comprising 37 members. Within three rounds this panel reached consensus on 19 key publication standards, with an overall response rate of 91%.
Conclusion: This project used multiple sources to develop and draw together evidence and expertise in realist synthesis. For each item we have included an explanation of why it is important and guidance on how it might be reported. Realist synthesis is a relatively new method for evidence synthesis, and we anticipate that these standards will evolve as experience accumulates and the method develops further. We hope that these standards will act as a resource that contributes to improving the reporting of realist syntheses.
To encourage dissemination of the RAMESES publication standards, this article is co-published in the Journal of Advanced Nursing and is freely accessible on Wiley Online Library (http://www.wileyonlinelibrary.com/journal/jan). Please see the related articles at http://www.biomedcentral.com/1741-7015/11/20 and http://www.biomedcentral.com/1741-7015/11/22
Evaluation research is tortured by time constraints. The policy cycle revolves more quickly than the research cycle, with the result that 'real time' evaluations often have little influence on policy making. As a result, the quest for evidence-based policy (EBP) has turned increasingly to systematic reviews of the results of previous inquiries in the relevant policy domain. However, this shifting of the temporal frame for evaluation is in itself no guarantee of success. Evidence, whether new or old, never speaks for itself. Accordingly, there is debate about the best strategy for marshalling bygone research results into the policy process. The first of this pair of articles (published in the previous issue of Evaluation) provided a critical review of the existing EBP strategies. This companion article considers the merits of a new methodology for systematic reviews, namely 'realist synthesis'. Keywords: evidence-based policy; incentives; realism; systematic review