Introduction: Assessing the process used to synthesize the evidence in clinical practice guidelines enables users to determine the trustworthiness of the recommendations. Clinicians are increasingly dependent on guidelines to keep up with the vast quantity of medical literature, and following guidelines can help avoid malpractice suits. We aimed to assess whether systematic methods were used when synthesizing the evidence for guidelines, and to determine the type of review cited in support of recommendations.

Methods: Guidelines published in 2017 and 2018 were retrieved from the TRIP and Epistemonikos databases. We randomly sorted and sequentially screened clinical guidelines on all topics to select the first 50 that met our inclusion criteria. Our primary outcomes were the number of guidelines using either a systematic or non-systematic process to gather, assess, and synthesize evidence; and the number of recommendations within guidelines based on different types of evidence synthesis (systematic or non-systematic reviews). If a review was cited, we looked for evidence that it was critically appraised and recorded which quality assessment tool was used. Finally, we examined the relationship between use of the GRADE approach, a systematic review process, and type of funder.

Results: Of the 50 guidelines, 17 (34%) systematically synthesized the evidence to inform recommendations. These 17 guidelines clearly reported their objectives and eligibility criteria, conducted comprehensive search strategies, and assessed the quality of the included studies. Of the 29/50 guidelines that cited reviews, 6 (21%) assessed the risk of bias of the cited reviews. The quality of primary studies was reported in 30/50 (60%) guidelines.

Conclusions: High quality, systematic review products provide the best available evidence to inform guideline recommendations. Using non-systematic methods compromises the validity and reliability of the evidence used to inform guideline recommendations, leading to potentially misleading and untrustworthy results.
Objectives: The aim of the study was to validate search filters for retrieval of clinical practice guidelines (CPGs) in MEDLINE, Embase, and PubMed.

Study Design and Setting: A search for filters for identifying CPGs was conducted in Google and the InterTASC Information Specialists SubGroup Search Filter Resource. To retrieve a random sample of CPGs for testing the sensitivity and precision of the filters, we used the TRIP and Epistemonikos databases. The citations were screened independently by two researchers, and sensitivity and precision were calculated.

Results: Five search filters were retrieved: two from the Canadian Agency for Drugs and Technologies in Health (CADTH), two from the University of Texas, and one from the MD Anderson Cancer Center Library. A total of 478 records were screened to identify 109 CPGs, which comprised the sample for testing sensitivity and precision. Sensitivity ranged from 87% to 98% across the five search filters, and precision was very low (<1%) across all databases.

Conclusion: Knowledge users who are interested in retrieving all relevant CPGs can use the CADTH broad filter, which has the highest sensitivity. However, our analysis shows that it remains difficult to efficiently identify CPGs because of the low precision of all five search filters. We recommend searching guideline-specific resources as a more time-efficient approach than searching bibliographic databases.
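As context for the reported figures, the following Python sketch shows one way a filter's sensitivity and precision against a hand-screened test set could be computed. It is a minimal illustration under assumed inputs: the record identifiers and counts are hypothetical and not data from the study.

```python
# Minimal sketch (assumed, hypothetical identifiers) of computing a search
# filter's sensitivity and precision against a screened test set of CPGs.

def filter_performance(test_set_ids: set[str], retrieved_ids: set[str]) -> dict[str, float]:
    """Sensitivity: share of known CPGs the filter retrieves.
    Precision: share of retrieved records that are known CPGs."""
    relevant_retrieved = len(test_set_ids & retrieved_ids)
    sensitivity = relevant_retrieved / len(test_set_ids) if test_set_ids else 0.0
    precision = relevant_retrieved / len(retrieved_ids) if retrieved_ids else 0.0
    return {"sensitivity": sensitivity, "precision": precision}

# Hypothetical example: a filter finding 98 of 109 test-set CPGs among
# 20,000 retrieved records has sensitivity ~0.90 and precision ~0.005 (<1%).
example = filter_performance(
    test_set_ids={f"cpg{i}" for i in range(109)},
    retrieved_ids={f"cpg{i}" for i in range(98)} | {f"other{i}" for i in range(19902)},
)
print(example)
```

This illustrates why a filter can be highly sensitive yet still impractical: precision depends on how many non-guideline records the broad terms sweep in.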
Introduction: Guidelines are systematically developed recommendations to assist practitioner and patient decisions about treatments for clinical conditions. High quality and comprehensive systematic reviews and ‘overviews of systematic reviews’ (overviews) represent the best available evidence. Many guideline developers, such as the WHO and the Australian National Health and Medical Research Council, recommend the use of these research syntheses to underpin guideline recommendations. We aim to evaluate the impact and use of systematic reviews with and without pairwise meta-analysis or network meta-analyses (NMAs), and of overviews, in clinical practice guideline (CPG) recommendations.

Methods and analysis: CPGs will be retrieved from Turning Research Into Practice (TRIP) and Epistemonikos (2017–2018). The retrieved citations will be sorted randomly and then screened sequentially by two independent reviewers until 50 CPGs have been identified. We will include CPGs that provide at least two explicit recommendations for the management of any clinical condition. We will assess whether reviews or overviews were cited in a recommendation as part of the development process for guidelines. Data extraction will be done independently by two authors and compared. We will assess the risk of bias by examining how each guideline developed clinical recommendations. We will calculate the number and frequency of citations of reviews with or without pairwise meta-analysis, reviews with NMAs, and overviews, and whether they were systematically or non-systematically developed. Results will be described, tabulated, and categorised based on review type (reviews or overviews). CPGs reporting the use of the Grading of Recommendations, Assessment, Development and Evaluation (GRADE) approach will be compared with those using a different system, and pharmacological versus non-pharmacological CPGs will be compared.

Ethics and dissemination: No ethics approval is required. We will present at the Cochrane Colloquium and the Guidelines International Network conference.
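A minimal Python sketch of the random-sort-then-sequential-screen selection step described in this protocol. The data structures, the fixed seed, and the `citation_is_eligible` callback are assumptions introduced for illustration; in the protocol, eligibility is judged by two independent reviewers rather than a single function.

```python
# Sketch (assumed data structures): shuffle the retrieved citations, then
# screen them in order until 50 eligible CPGs have been included.
import random

def select_cpgs(citations: list[dict], citation_is_eligible, target: int = 50, seed: int = 42) -> list[dict]:
    rng = random.Random(seed)      # fixed seed so the random order is reproducible
    shuffled = citations[:]
    rng.shuffle(shuffled)
    included = []
    for citation in shuffled:      # sequential screening in the random order
        if citation_is_eligible(citation):
            included.append(citation)
        if len(included) == target:  # stop once the quota of 50 CPGs is reached
            break
    return included
```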
Background: Guidelines are systematically developed recommendations to assist practitioner and patient decisions about treatments for clinical conditions. Researchers, healthcare professionals, and policy makers need to be able to retrieve clinical practice guidelines (CPGs) efficiently and quickly from the literature. Despite the widespread use of CPGs in practice and policy formulation, no filter for the retrieval of guidelines has been validated to date. A validated search filter for CPGs would make their retrieval from major bibliographic databases more efficient.

Objectives: We aim to fill this gap by validating search filters for use in the systematic retrieval of CPGs and measuring their performance in terms of sensitivity and precision.

Methods: We found four search filters for retrieval of CPGs (two CADTH filters, a PubMed filter, and a University of Texas filter), which we will validate in three databases (MEDLINE, Embase, and PubMed). We will derive a test set of CPGs from a search of the TRIP and Epistemonikos databases. The citations retrieved will be randomly sorted and screened sequentially by two reviewers until at least 100 CPGs are included. We will include CPGs that provide at least two explicit recommendations for the treatment of any clinical condition and that are produced by a group or organization (i.e., not authored by one person). We will translate the filters into Ovid MEDLINE, Embase, and PubMed syntax as appropriate. Then, we will run the strategies and assess whether the filters retrieved the citations in our test set. We will calculate and compare the sensitivity and precision of the four filters in each database. The limitations of the CADTH, PubMed, and University of Texas search filters in each database will be assessed by examining the keywords in the titles and abstracts of the citations not found by the search filters.

Discussion: Decision makers, healthcare providers, and researchers will be able to choose the most precise and sensitive search strategy among the four available, which will enable them to more efficiently identify relevant clinical practice guidelines.
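The planned failure analysis (examining keywords in the titles and abstracts of citations a filter missed) could look roughly like the sketch below. The field names, tokenisation rule, and `top_n` cutoff are assumptions for illustration, not details specified in the protocol.

```python
# Sketch (assumed record fields "id", "title", "abstract"): tally words in
# the test-set CPGs that a filter did NOT retrieve, to surface candidate
# terms the filter may be missing.
import re
from collections import Counter

def missed_citation_keywords(test_set: list[dict], retrieved_ids: set[str], top_n: int = 20) -> list[tuple[str, int]]:
    missed = [rec for rec in test_set if rec["id"] not in retrieved_ids]
    counts: Counter[str] = Counter()
    for rec in missed:
        text = f"{rec.get('title', '')} {rec.get('abstract', '')}".lower()
        counts.update(re.findall(r"[a-z]{4,}", text))  # crude tokeniser: words of 4+ letters
    return counts.most_common(top_n)
```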