2023
DOI: 10.11124/jbies-23-00139
Guidance to best tools and practices for systematic reviews

Abstract: Data continue to accumulate indicating that many systematic reviews are methodologically flawed, biased, redundant, or uninformative. Some improvements have occurred in recent years based on empirical methods research and standardization of appraisal tools; however, many authors do not routinely or consistently apply these updated methods. In addition, guideline developers, peer reviewers, and journal editors often disregard current methodological standards. Although extensively acknowledged and explored in th…

Cited by 9 publications (11 citation statements)
References 211 publications (505 reference statements)
“…A timely manuscript in this month's issue of JBI Evidence Synthesis 1 offers an opportunity for reviewers and readers to take stock of and reflect on where the field of evidence synthesis, and organizational directives applicable to the field, including those from JBI, have arrived to date. Kolaski et al 1 highlight current deficiencies and confusion across terminology, methods of synthesis, and the application of the available methods by review authors, all of which rightfully cast doubt on the trustworthiness of many systematic reviews and question their authoritative claim to most appropriately guide decision-making. 1 Noteworthy among these issues, Kolaski et al 1 identify problems associated with the classification of primary study designs and the varying taxonomies and algorithms available to assist with classification.…”
“…This issue is not exclusive to primary research; it is also apparent at the secondary research level, where the waters are further muddied by evolving methodologies of synthesis that continue to emerge.…”