2018
DOI: 10.1016/j.jclinepi.2017.12.001

A retrospective comparison of systematic reviews with same-topic rapid reviews

Cited by 43 publications
(37 citation statements)
References 22 publications
“…19 For example, only one-fourth of the reviews assessed the methodological quality of primary studies using a standardized tool, such as the Joanna Briggs Institute (JBI) critical appraisal tools or the Downs and Black checklist. Involving only one reviewer to screen studies, conducting the review without adequate expertise or consulting experts, including a small number of studies, and faster completion with poorer reporting quality could be possible reasons. 42 Furthermore, the lack of well-established and harmonized criteria for academic career promotion and incentivization may lead researchers to focus only on the number of publications instead of ensuring the quality of reviews.…”
Section: Discussion
confidence: 99%
“…Application of such software can substantially reduce reviewers' workload and expedite the dissemination of findings, especially in cases where reviewers have to screen a large number of records. Of note, a recent study comparing systematic reviews and same-topic rapid reviews found that the median duration of completion was 9.5 months for systematic reviews and 3 months for rapid reviews, although findings were generally consistent between the two types of reviews [54]. Limitations should also be acknowledged at the study and review level.…”
Section: Strengths and Limitations
confidence: 95%
“…Moreover, we focused on the highest quality of evidence derived exclusively from RCTs and, in comparison with a previous systematic review on shared decision-making and outcomes in Type 2 diabetes [8], we included additional data from seven publications [23][24][25][26][27]36,38] that assessed different types of PtDAs. Of note, a recent study comparing systematic reviews and same-topic rapid reviews found that the median duration of completion was 9.5 months for systematic reviews and 3 months for rapid reviews, although findings were generally consistent between the two types of reviews [54]. In particular, when applied prior to screening abstracts, the RCT model accurately excluded approximately half of search records as non-RCTs.…”
Section: Strengths and Limitations
confidence: 99%
“…Although the reliability of rapid review findings might be limited compared to that of systematic reviews, decision-makers increasingly request rapid review products to answer urgent clinical or public health questions [1,2]. For example, Australian policy agencies used 134 of 150 commissioned rapid reviews (89%) to decide the details of a policy or program, identify priorities for future action, or communicate evidence to stakeholders [3].…”
Section: Introduction
confidence: 99%