Scoping reviews, a type of knowledge synthesis, follow a systematic approach to map evidence on a topic and identify main concepts, theories, sources, and knowledge gaps. Although more scoping reviews are being done, their methodological and reporting quality need improvement. This document presents the PRISMA-ScR (Preferred Reporting Items for Systematic reviews and Meta-Analyses extension for Scoping Reviews) checklist and explanation. The checklist was developed by a 24-member expert panel and 2 research leads following published guidance from the EQUATOR (Enhancing the QUAlity and Transparency Of health Research) Network. The final checklist contains 20 essential reporting items and 2 optional items. The authors provide a rationale and an example of good reporting for each item. The intent of the PRISMA-ScR is to help readers (including researchers, publishers, commissioners, policymakers, health care providers, guideline developers, and patients or consumers) develop a greater understanding of relevant terminology, core concepts, and key items to report for scoping reviews.
Background
Scoping reviews are used to identify knowledge gaps, set research agendas, and identify implications for decision-making. The conduct and reporting of scoping reviews is inconsistent in the literature. We conducted a scoping review to identify: papers that utilized and/or described scoping review methods; guidelines for reporting scoping reviews; and studies that assessed the quality of reporting of scoping reviews.
Methods
We searched nine electronic databases for published and unpublished literature on scoping review papers, scoping review methodology, and reporting guidance for scoping reviews. Two independent reviewers screened citations for inclusion. Data abstraction was performed by one reviewer and verified by a second reviewer. Quantitative (e.g. frequencies of methods) and qualitative (i.e. content analysis of the methods) syntheses were conducted.
Results
After screening 1525 citations and 874 full-text papers, 516 articles were included, of which 494 were scoping reviews. The 494 scoping reviews were disseminated between 1999 and 2014, with 45 % published after 2012. Most of the scoping reviews were conducted in North America (53 %) or Europe (38 %), and reported a public source of funding (64 %). The number of studies included in the scoping reviews ranged from 1 to 2600 (mean of 118). Using the Joanna Briggs Institute methodology guidance for scoping reviews, only 13 % of the scoping reviews reported the use of a protocol, 36 % used two reviewers for selecting citations for inclusion, 29 % used two reviewers for full-text screening, 30 % used two reviewers for data charting, and 43 % used a pre-defined charting form. In most cases, the results of the scoping review were used to identify evidence gaps (85 %), provide recommendations for future research (84 %), or identify strengths and limitations (69 %). We did not identify any guidelines for reporting scoping reviews or studies that assessed the quality of scoping review reporting.
Conclusion
The number of scoping reviews conducted per year has steadily increased since 2012. Scoping reviews are used to inform research agendas and identify implications for policy or practice. As such, improvements in reporting and conduct are imperative. Further research on scoping review methodology is warranted, and in particular, there is a need for a guideline to standardize reporting.
Electronic supplementary material
The online version of this article (doi:10.1186/s12874-016-0116-4) contains supplementary material, which is available to authorized users.
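As a rough illustration of the kind of quantitative synthesis described above (tallying how often each methodological feature is reported across the included reviews), the Python sketch below counts feature frequencies. The feature names and example records are hypothetical placeholders, not data from this review.

```python
from collections import Counter

# Hypothetical records: one dict per included scoping review, flagging which
# methodological features (loosely based on JBI guidance items) were reported.
reviews = [
    {"protocol": True, "dual_citation_screening": True, "dual_full_text": False,
     "dual_charting": False, "predefined_charting_form": True},
    {"protocol": False, "dual_citation_screening": False, "dual_full_text": False,
     "dual_charting": True, "predefined_charting_form": False},
    {"protocol": False, "dual_citation_screening": True, "dual_full_text": True,
     "dual_charting": False, "predefined_charting_form": True},
]

# Count how many reviews reported each feature.
counts = Counter()
for record in reviews:
    counts.update(feature for feature, reported in record.items() if reported)

# Report each feature as a share of all included reviews.
for feature, n in sorted(counts.items()):
    print(f"{feature}: {n}/{len(reviews)} ({100 * n / len(reviews):.0f} %)")
```

In a real synthesis the records would be populated from the data-charting form rather than hard-coded, but the frequency calculation itself would look much the same.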
Background
Rapid reviews are a form of knowledge synthesis in which components of the systematic review process are simplified or omitted to produce information in a timely manner. Although numerous centers are conducting rapid reviews internationally, few studies have examined the methodological characteristics of rapid reviews. We aimed to examine articles, books, and reports that evaluated, compared, used, or described rapid reviews or rapid review methods through a scoping review.
Methods
MEDLINE, EMBASE, the Cochrane Library, internet websites of rapid review producers, and reference lists were searched to identify articles for inclusion. Two reviewers independently screened literature search results and abstracted data from included studies. Descriptive analysis was conducted.
Results
We included 100 articles plus one companion report that were published between 1997 and 2013. The studies were categorized as 84 application papers, seven development papers, six impact papers, and four comparison papers (one was included in two categories). The rapid reviews took between 1 and 12 months to complete and were conducted predominantly in Europe (58 %) and North America (20 %). The included studies failed to report 6 % to 73 % of the specific systematic review steps examined. Fifty unique rapid review methods were identified; 16 methods occurred more than once. Streamlined methods that were used in the 82 rapid reviews included limiting the literature search to published literature (24 %) or one database (2 %), limiting inclusion criteria by date (68 %) or language (49 %), having one person screen and another verify or screen excluded studies (6 %), having one person abstract data and another verify (23 %), not conducting risk of bias/quality appraisal (7 %) or having only one reviewer conduct the quality appraisal (7 %), and presenting results as a narrative summary (78 %). Four case studies were identified that compared the results of rapid reviews to systematic reviews. Three studies found that the conclusions of the rapid reviews and systematic reviews were congruent.
Conclusions
Numerous rapid review approaches were identified, and few were used consistently in the literature. Poor quality of reporting was observed. A prospective study comparing the results from rapid reviews to those obtained through systematic reviews is warranted.
Electronic supplementary material
The online version of this article (doi:10.1186/s12916-015-0465-6) contains supplementary material, which is available to authorized users.
Scoping reviews are an increasingly common approach to evidence synthesis, with a growing suite of methodological guidance and resources to assist review authors with their planning, conduct, and reporting. The latest guidance for scoping reviews includes the JBI methodology and the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR). This paper provides readers with a brief update regarding ongoing work to enhance and improve the conduct and reporting of scoping reviews, as well as information regarding future steps in scoping review methods development. The purpose of this paper is to provide readers with a concise source of information regarding the differences between scoping reviews and other review types, the reasons for undertaking scoping reviews, and an update on methodological guidance for the conduct and reporting of scoping reviews.
Despite available guidance, some publications use the term ‘scoping review’ without clear consideration of available reporting and methodological tools. Selecting the most appropriate review type for the stated research objectives or questions, using standardised methodological approaches and terminology, reporting clearly and consistently, and ensuring that the presentation of the results addresses the review’s objective(s) and question(s) are critical components for improving the rigour of scoping reviews.
Rigorous, high-quality scoping reviews should clearly follow up-to-date methodological guidance and reporting criteria. Stakeholder engagement is one area where further work could enhance the integration of consultation with the results of evidence syntheses and support effective knowledge translation. Scoping review methodology is evolving as a policy and decision-making tool. Ensuring the integrity of scoping reviews through adherence to up-to-date reporting standards is integral to supporting well-informed decision-making.
Objective
To assess the characteristics and core statistical methodology specific to network meta-analyses (NMAs) in clinical research articles.
Study Design and
What is new?
Key findings
Although the amount of evidence (the number of treatments and studies) included in published NMAs remains stable, the undertaking and reporting of statistical methods have significantly improved over the years. The assumptions underlying NMA are increasingly discussed and evaluated using appropriate methods. Less than 10% of NMAs published in 2014 and 2015 failed to evaluate the assumptions of the joint synthesis.
What this adds to what is known
This meta-epidemiological study presents the largest collection of published NMAs over the past 16 years. It provides an overview of the structural characteristics and statistical methodology of 456 published networks of interventions. It shows that the statistical methods in NMA have considerably improved in all aspects, and some, such as the use of appropriate methods to evaluate the plausibility of the assumptions, are now routinely performed. We conclude that the increasingly populous community of NMA methodologists is quickly advancing through the learning curve of statistical methods employed in NMA.
What is the implication, what should change now
The updated description of the structural characteristics of the published NMAs can be used to inform pragmatic simulation studies and the development of methods that are relevant to the types of networks typically found in the medical literature. Future tutorials and training should focus on improving the methodology and reporting of items that, although improved, remain uncommon, such as the formal exploration of heterogeneity and inconsistency and the presentation of all pairwise treatment effects.
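The structural characteristics mentioned above (how many treatments a network connects, how many studies inform each direct comparison, and whether the network is connected) can be summarised from a list of pairwise comparisons. The sketch below is a hypothetical illustration using networkx; the treatments and study counts are invented, and this is not the methodology used in the study summarised above.

```python
import networkx as nx

# Hypothetical network of interventions: each edge is a direct head-to-head
# comparison, weighted by the number of studies providing that comparison.
comparisons = [
    ("placebo", "drug A", 12),
    ("placebo", "drug B", 8),
    ("drug A", "drug B", 3),
    ("placebo", "drug C", 5),
]

network = nx.Graph()
for treat_1, treat_2, n_studies in comparisons:
    network.add_edge(treat_1, treat_2, studies=n_studies)

# Basic structural characteristics of the evidence network.
print("treatments:", network.number_of_nodes())
print("direct comparisons:", network.number_of_edges())
print("total studies:", sum(d["studies"] for _, _, d in network.edges(data=True)))
print("connected:", nx.is_connected(network))      # a joint synthesis needs a connected network
print("density:", round(nx.density(network), 2))   # share of possible pairwise comparisons observed directly
```

Summaries of this kind describe only the geometry of the evidence; the joint synthesis itself, and checks of its assumptions such as consistency between direct and indirect evidence, require dedicated NMA software.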