Objective: To assess the methods used to identify, analyze, and synthesize results of empirical research on intervention effects, and to determine whether published reviews are vulnerable to various sources and types of bias.

Methods: Study 1 examined the methods, sources, and conclusions of 37 published reviews of research on the effects of a model program. Study 2 compared the findings of one published trial with summaries of that trial's results that appeared in published reviews.

Results: Study 1: Published reviews varied in the transparency of their inclusion criteria, strategies for locating relevant published and unpublished data, standards used to evaluate evidence, and methods used to synthesize results across studies. Most reviews relied solely on narrative analysis of a convenience sample of published studies. None of the reviews used systematic methods to identify, analyze, and synthesize results. Study 2: When the results of a single study were traced from the original report to summaries in published reviews, three patterns emerged: a complex set of results was simplified, non-significant results were ignored, and positive results were over-emphasized. Most reviews used a single positive statement to characterize results of a study that were decidedly mixed. This suggests that the reviews were influenced by confirmation bias, the tendency to emphasize evidence that supports a hypothesis and to ignore evidence to the contrary.

Conclusions: Published reviews may be vulnerable to biases that scientific methods of research synthesis were designed to address. This raises important questions about the validity of traditional sources of knowledge about "what works," and suggests a need for a renewed commitment to using scientific methods to produce valid evidence for practice.
The emphasis on evidence-based practice appears to have renewed interest in "what works" and "what works best for whom" in response to specific conditions, disorders, and psychosocial problems. Policy makers, practitioners, and consumers want to know about the likely benefits, potential harmful effects, and evidentiary status of various interventions (Davies, 2004; Gibbs, 2003). To address these issues, many reviewers have synthesized results of research on the impacts of psychosocial interventions. These reviews appear in numerous books and scholarly journals; concise summaries and lists of "what works" can be found on many government and professional organizations' websites.

In the last decade there have been rapid developments in the science of research synthesis, following the publication of a seminal handbook on this topic (Cooper & Hedges, 1994). Yet the practice of research synthesis (as represented by the proliferation of published reviews and lists of evidence-based practices) and the science of research synthesis have not been well connected (Littell, 2005).

In this article, I trace the development and dissemination of information about the efficacy and effectiveness of one of the most prominent evidence-based pract...