Background: Literature searches underlie the foundations of systematic reviews and related review types. Yet, the literature searching component of systematic reviews and related review types is often poorly reported. Guidance for literature search reporting has been diverse and, in many cases, does not offer enough detail to authors who need more specific information about reporting search methods and information sources in a clear, reproducible way. This document presents the PRISMA-S (Preferred Reporting Items for Systematic reviews and Meta-Analyses literature search extension) checklist, and explanation and elaboration. Methods: The checklist was developed using a 3-stage Delphi survey process, followed by a consensus conference and public review process. Results: The final checklist includes 16 reporting items, each of which is detailed with exemplar reporting and rationale. Conclusions: The intent of PRISMA-S is to complement the PRISMA Statement and its extensions by providing a checklist that could be used by interdisciplinary authors, editors, and peer reviewers to verify that each component of a search is completely reported and, therefore, reproducible.
Background: Stringent requirements exist regarding the transparency of the study selection process and the reliability of results. A 2-step selection process is generally recommended; this is conducted by 2 reviewers independently of each other (conventional double screening). However, this approach is resource intensive, which can be a problem, as systematic reviews generally need to be completed within a defined period with a limited budget. The aim of the following methodological systematic review was to analyse the available evidence on whether single screening is equivalent to double screening in the screening process conducted in systematic reviews. Methods: We searched Medline, PubMed and the Cochrane Methodology Register (last search 10/2018). We also used supplementary search techniques and sources (the "similar articles" function in PubMed, conference abstracts and reference lists). We included all evaluations comparing single with double screening. Data were summarized in a structured, narrative way. Results: The 4 included evaluations investigated a total of 23 single screenings (12 sets for screening involving 9 reviewers). The median proportion of missed studies was 5% (range: 0 to 58%): 3% for the 6 experienced reviewers (range: 0 to 21%) and 13% for the 3 reviewers with less experience (range: 0 to 58%). The impact of missed studies on the findings of meta-analyses was reported in 2 evaluations covering 7 single screenings with a total of 18,148 references. In 3 of these 7 single screenings, all conducted by the same less experienced reviewer, the findings would have changed substantially. The remaining 4 of these 7 screenings were conducted by experienced reviewers, and the missed studies had no impact, or only a negligible impact, on the findings of the meta-analyses.
Conclusions: Single screening of the titles and abstracts of studies retrieved in bibliographic searches is not equivalent to double screening, as substantially more studies are missed. However, in our opinion such an approach could still represent an appropriate methodological shortcut in rapid reviews, as long as it is conducted by an experienced reviewer. Further research on single screening is required, for instance regarding factors influencing the number of studies missed. Electronic supplementary material: The online version of this article (10.1186/s12874-019-0782-0) contains supplementary material, which is available to authorized users.
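The headline statistic in this evaluation, the proportion of missed studies, compares a single screener's inclusions against the double-screening reference standard. A minimal sketch of that calculation, using hypothetical study identifiers rather than data from the included evaluations:

```python
def missed_proportion(reference_included, single_included):
    """Proportion of relevant studies missed by a single screener,
    relative to the set identified by double screening."""
    reference = set(reference_included)
    missed = reference - set(single_included)
    return len(missed) / len(reference)

# Hypothetical example: double screening identified 20 relevant studies
# and a single screener caught 19 of them.
reference = {f"study_{i}" for i in range(20)}
single = reference - {"study_0"}
print(f"{missed_proportion(reference, single):.0%}")  # 5%
```

The same ratio can then be aggregated across screenings to obtain the medians and ranges reported above.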
Background: Literature searches underlie the foundations of systematic reviews and related review types. Yet, the literature searching component of systematic reviews and related review types is often poorly reported. Guidance for literature search reporting has been diverse and, in many cases, does not offer enough detail to authors who need more specific information about reporting search methods and information sources in a clear, reproducible way. This document presents the PRISMA-S (Preferred Reporting Items for Systematic reviews and Meta-Analyses literature search extension) checklist, and explanation and elaboration.Methods: The checklist was developed using a three-stage Delphi survey process, followed by a consensus conference and public review process.Results: The final checklist includes sixteen reporting items, each of which is detailed with exemplar reporting and rationale.Conclusions:The intent of PRISMA-S is to complement the PRISMA Statement and its extensions by providing a checklist that could be used by interdisciplinary authors, editors, and peer reviewers to verify that each component of a search is completely reported and, therefore, reproducible.
Background: Over the past few years, information retrieval has become more and more professionalized, and information specialists are considered full members of a research team conducting systematic reviews. Research groups preparing systematic reviews and clinical practice guidelines have been the driving force in the development of search strategies, but open questions remain regarding the transparency of the development process and the available resources. An empirically guided approach to the development of a search strategy provides a way to increase transparency and efficiency. Methods: Our aim in this paper is to describe the empirically guided development process for search strategies as applied by the German Institute for Quality and Efficiency in Health Care (Institut für Qualität und Wirtschaftlichkeit im Gesundheitswesen, or "IQWiG"). This process consists of the following steps: generation of a test set, followed by the development, validation and standardized documentation of the search strategy. Results: We illustrate our approach by means of an example, that is, a search for literature on brachytherapy in patients with prostate cancer. For this purpose, a test set was generated, including a total of 38 references from 3 systematic reviews. The development set for the generation of the strategy included 25 references. After application of textual analytic procedures, a strategy was developed that included all references in the development set. To test the search strategy on an independent set of references, the remaining 13 references in the test set (the validation set) were used. The validation set was also completely identified. Discussion: Our conclusion is that an objectively derived approach similar to that used in search filter development is a feasible way to develop and validate reliable search strategies.
Besides creating high-quality strategies, the widespread application of this approach will result in a substantial increase in the transparency of the development process of search strategies.
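The core of the empirically guided process described above is a split of the test set into a development set (used to derive candidate search terms via textual analysis) and a held-out validation set (used to check that the finished strategy retrieves every known relevant reference). A minimal sketch of that workflow, with hypothetical titles and simplified term matching in place of real database syntax and word-frequency tooling:

```python
from collections import Counter

def frequent_terms(dev_titles, top_n=5):
    """Rank candidate search terms by frequency across development-set
    titles (a stand-in for the textual analytic procedures; a real
    workflow would also draw on controlled vocabulary)."""
    counts = Counter(word.lower() for title in dev_titles for word in title.split())
    return [word for word, _ in counts.most_common(top_n)]

def validate(strategy_terms, validation_titles):
    """Simplified recall check: the strategy passes only if every
    held-out validation reference matches at least one term."""
    return all(any(term in title.lower() for term in strategy_terms)
               for title in validation_titles)

development_set = ["Brachytherapy for localized prostate cancer",
                   "Prostate brachytherapy outcomes"]
validation_set = ["Permanent brachytherapy in prostate cancer patients"]
terms = frequent_terms(development_set, top_n=3)
print(validate(terms, validation_set))  # True
```

In the paper's example the strategy was refined until it retrieved all 25 development references, then confirmed against the 13 validation references; the sketch mirrors only that pass/fail logic.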
This preprint has been submitted for peer review. OBJECTIVES: Despite wide usage of the PRISMA Statement reporting guideline by systematic review authors, compliance with its items regarding literature search reporting is suboptimal. We sought to develop an international standard for literature search reporting aligned with the PRISMA Statement to improve the quality and reproducibility of reported literature searches. METHODOLOGY: We formed an executive committee to lead the PRISMA-S extension development. The study protocol was published prior to study inception. To identify potential items for inclusion, we performed a literature search. Identified items were reviewed for overlap and consolidated. We then used a three-step Delphi survey process to assess the items. The first survey asked 163 international experts to rate each item, and the second and third rounds asked respondents to select the 25 most necessary items for a checklist. Potential items moved to rounds 2 and 3 based on pre-specified criteria. Remaining items were discussed at an in-person consensus conference. After the consensus conference, the remaining items were consolidated into a checklist. Executive committee members developed an accompanying explanation and elaboration document. The checklist and documentation were distributed for pilot testing. RESULTS: We identified 405 potential items from 61 sources located through the literature search process. Sources included both explicit reporting guidelines and studies assessing the reproducibility of search strategies. These were consolidated into 123 potential items for the Delphi survey. We received 52 responses (32% response rate) to the first survey, and 35 (67% response rate) to both surveys two and three. The results of the Delphi process were reported at the consensus conference meeting in May 2016. Post-consensus conference, 34 items remained. The checklist was finalized into 13 items and 10 sub-items. Pilot testing is underway.
This document is the draft of the Explanation & Elaboration for Draft 1, released for review and testing. CONCLUSIONS: The PRISMA-S extension for the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) reporting guideline will provide a consensus-driven international standard for literature search reporting. Using this new reporting guideline may enable librarians and information specialists to produce higher-quality, more reproducible, and more transparent search strategies for systematic reviews and other literature review-based publications.
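The later Delphi rounds described above reduce a large item pool by asking each respondent to select the items they consider most necessary and tallying the votes. A minimal sketch of that tallying step, with hypothetical item names; the actual criteria for carrying PRISMA-S items forward were pre-specified in the published protocol, not a simple top-k cut:

```python
from collections import Counter

def top_items(ballots, k):
    """Tally Delphi-round ballots (each ballot is the set of item IDs
    one respondent selected) and return the k most frequently chosen
    items. Illustrative only."""
    votes = Counter(item for ballot in ballots for item in ballot)
    return [item for item, _ in votes.most_common(k)]

ballots = [{"search_strategy", "databases", "limits"},
           {"databases", "search_strategy"},
           {"databases", "dates"}]
print(top_items(ballots, k=2))  # ['databases', 'search_strategy']
```

Ties in `Counter.most_common` fall back to insertion order, so a real consensus process would break ties by explicit rules rather than rely on this behavior.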