Background: The gap between research and practice or policy is often described as a problem. To identify new barriers to and facilitators of the use of evidence by policymakers, and to assess the state of research in this area, we updated a systematic review.

Methods: Systematic review. We searched online databases including Medline, Embase, SocSci Abstracts, CDS, DARE, Psychlit, Cochrane Library, NHSEED, HTA, PAIS, IBSS (search dates: July 2000 to September 2012). Studies were included if they were primary research or systematic reviews about factors affecting the use of evidence in policy. Studies were coded to extract data on methods, topic, focus, results and population.

Results: 145 new studies were identified, of which over half were published after 2010. Thirteen systematic reviews were included. Compared with the original review, a much wider range of policy topics was found. Although still primarily in the health field, studies were also drawn from criminal justice, traffic policy, drug policy, and partnership working. The most frequently reported barriers to evidence uptake were poor access to good-quality, relevant research and a lack of timely research output. The most frequently reported facilitators were collaboration between researchers and policymakers, and improved relationships and skills. There is an increasing amount of research into new models of knowledge transfer and into evaluations of interventions such as knowledge brokerage.

Conclusions: Timely access to good-quality, relevant research evidence, collaboration with policymakers, and relationship- and skills-building with policymakers are reported to be the most important factors influencing the use of evidence. Although investigations into the use of evidence have spread beyond the health field and into more countries, the main barriers and facilitators remain the same as in the earlier review. Few studies provide clear definitions of policy, evidence or policymaker, and empirical data about policy processes or the implementation of policy are not widely available. It is therefore difficult to describe the role of evidence and other factors influencing policy. Future research and policy priorities should aim to illuminate these concepts and processes, target the factors identified in this review, and consider new methods of overcoming the barriers described.
Our findings are consistent with the idea that 'downstream' preventive interventions are more likely to increase health inequalities than 'upstream' interventions. More consistent reporting of differential intervention effectiveness is required to help build the evidence base on IGIs.
Randomized trials of complex public health interventions generally aim to identify what works, accrediting specific intervention 'products' as effective. This approach often fails to give sufficient consideration to how intervention components interact with each other and with local context. 'Realists' argue that trials misunderstand the scientific method, offer only a 'successionist' approach to causation which brackets out the complexity of social causation, and fail to ask which interventions work, for whom, and under what circumstances. We counter-argue that trials are useful in evaluating social interventions because randomized control groups take proper account of, rather than bracket out, the complexity of social causation. Nonetheless, realists are right to stress understanding of 'what works, for whom and under what circumstances' and to argue for the importance of theorizing and empirically examining underlying mechanisms. We propose that these aims can be (and sometimes already are) examined within randomized trials. Such 'realist' trials should aim to: examine the effects of intervention components separately and in combination, for example using multi-arm studies and factorial trials; explore mechanisms of change, for example by analysing how pathway variables mediate intervention effects; use multiple trials across contexts to test how intervention effects vary with context; draw on complementary qualitative and quantitative data; and be oriented towards building and validating 'mid-level' program theories which would set out how interventions interact with context to produce outcomes. This last suggestion resonates with recent suggestions that, in delivering truly 'complex' interventions, fidelity is important not so much in terms of precise activities but, rather, key intervention 'processes' and 'functions'. Realist trials would additionally determine the validity of program theory rather than only examining 'what works', to better inform policy and practice in the long term.
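As a purely illustrative sketch of the kind of analysis this implies (not drawn from the paper itself), the following Python snippet simulates a 2x2 factorial trial, estimates the separate and combined effects of two intervention components including their interaction, and probes a hypothesised mediator with a simple product-of-coefficients decomposition. All variable names, effect sizes and data are made up for illustration.

```python
# Illustrative sketch only: factorial trial analysis with a mediation check.
# Data are simulated; component and mediator names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 2000

# Randomise two intervention components independently (2x2 factorial design).
a = rng.integers(0, 2, n)   # component A, e.g. a skills-based element
b = rng.integers(0, 2, n)   # component B, e.g. an environmental change

# Hypothesised mechanism: component A acts partly through a pathway variable.
mediator = 0.5 * a + rng.normal(0, 1, n)
outcome = 0.3 * a + 0.2 * b + 0.15 * a * b + 0.4 * mediator + rng.normal(0, 1, n)

df = pd.DataFrame({"a": a, "b": b, "m": mediator, "y": outcome})

# 1. Separate and combined component effects; the a:b term tests synergy.
factorial = smf.ols("y ~ a * b", data=df).fit()
print(factorial.summary().tables[1])

# 2. Simple product-of-coefficients mediation check for component A.
med_model = smf.ols("m ~ a + b", data=df).fit()       # effect of A on the mediator
out_model = smf.ols("y ~ a + b + m", data=df).fit()   # effect of mediator on outcome, adjusting for components
indirect = med_model.params["a"] * out_model.params["m"]
print(f"Indirect (mediated) effect of A: {indirect:.3f}")
print(f"Direct effect of A, holding the mediator fixed: {out_model.params['a']:.3f}")
```

In practice such analyses would be pre-specified and, as the abstract suggests, interpreted against a mid-level program theory and repeated across contexts rather than in a single simulated dataset.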
Despite 40 years of research into evidence-based policy (EBP) and a continued drive from both policymakers and researchers to increase research uptake in policy, barriers to the use of evidence are persistently identified in the literature. However, it is not clear what explains this persistence – whether they represent real factors, or whether they are artefacts of the approaches used to study EBP. Based on an updated review, this paper analyses this literature to explain persistent barriers and facilitators. We critically describe the literature in terms of its theoretical underpinnings, definitions of ‘evidence’, methods, and underlying assumptions of research in the field, and aim to illuminate the EBP discourse by comparison with approaches from other fields. Much of the research in this area is theoretically naive, focusing primarily on the uptake of research evidence as opposed to evidence defined more broadly, and privileging academics’ research priorities over those of policymakers. Little empirical data analysing the processes or impact of evidence use in policy is available to inform researchers or decision-makers. EBP research often assumes that policymakers do not use evidence and that more evidence – meaning research evidence – use would benefit policymakers and populations. We argue that these assumptions are unsupported, biasing much of EBP research. The agenda of ‘getting evidence into policy’ has side-lined the empirical description and analysis of how research and policy actually interact in vivo. Rather than asking how research evidence can be made more influential, academics should aim to understand what influences and constitutes policy, and produce more critically and theoretically informed studies of decision-making. We question the main assumptions made by EBP researchers, explore the implications of doing so, and propose new directions for EBP research and health policy.
This paper presents the findings from a review of the theoretical and empirical literature on the links between crime and fear of crime, the social and built environment, and health and wellbeing. A pragmatic approach was employed, with iterative stages of searching and synthesis. This produced a holistic causal framework of pathways to guide future research. The framework emphasises that crime and fear of crime may have substantial impacts on wellbeing, but the pathways are often highly indirect, mediated by environmental factors, difficult to disentangle and not always in the expected direction. The built environment, for example, may affect health via its impacts on health behaviours; via its effects on crime and fear of crime; or via the social environment. The framework also helps to identify unexpected factors which may affect intervention success, such as the risk of adverse effects from crime prevention interventions as a result of raising awareness of crime.