2020
DOI: 10.1371/journal.pone.0233107

Reproducibility of individual effect sizes in meta-analyses in psychology

Abstract: To determine the reproducibility of psychological meta-analyses, we investigated whether we could reproduce 500 primary study effect sizes drawn from 33 published meta-analyses based on the information given in the meta-analyses, and whether recomputations of primary study effect sizes altered the overall results of the meta-analysis. Results showed that almost half (k = 224) of all sampled primary effect sizes could not be reproduced based on the reported information in the meta-analysis, mostly because of in…

Cited by 58 publications (61 citation statements)
References 30 publications
“…The scientific methods for meta-analysis are well-established (Borenstein et al., 2009; Koricheva et al., 2013; Glass, 2015; Cooper et al., 2019). Other recommended practices go further, also recommending sharing data and code for quality control, reproducibility of results, and updating earlier meta-analyses (Button et al., 2013; Gurevitch et al., 2018; Lakens et al., 2016; Maassen et al., 2020; Polanin et al., 2020). However, even the well-established methods for research synthesis are not universally followed.…”
Section: Results (mentioning)
confidence: 99%
“…We chose to check RECALL performance first, given that RECALL as an outcome is rather clear (i.e., what constitutes RECALL as opposed to what constitutes TRANSFER). For effect size recalculation, we followed the workflow of Maassen et al. (2020; their Figure 1, page 5): We identified appropriate (composite) effects in primary studies based on available information in the meta-analysis; when more than one (composite) effect calculation was possible, we calculated multiple logically possible effects and chose the effect size closest to the primary effect reported in the meta-analysis.…”
Section: Guidelines and Inconsistencies in Xie et al.'s ( (mentioning)
confidence: 99%
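The selection step described in the workflow above — compute every logically possible effect, then keep the candidate nearest to the value reported in the meta-analysis — can be sketched as follows. This is an illustrative helper, not code from Maassen et al. (2020) or the citing paper:

```python
def closest_effect(candidates, reported):
    """Given several logically possible (composite) effect-size
    recomputations, return the one closest to the effect size
    reported in the meta-analysis."""
    return min(candidates, key=lambda es: abs(es - reported))

# Example: three plausible recomputations against a reported d of 0.45
best = closest_effect([0.38, 0.47, 0.61], 0.45)
print(best)  # 0.47
```

Choosing the nearest candidate is a conservative matching rule: it gives the meta-analyst the benefit of the doubt, so any remaining discrepancy understates rather than overstates the reproducibility problem.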
“…Such problems are not confined to Xie and colleagues’ meta-analysis (cf. Maassen et al., 2020); nevertheless, they undermine reproducibility, given that it was often unclear to us exactly which effect sizes were selected, whether and how they were combined, and exactly which calculation steps and formulas were applied.…”
Section: Guidelines and Inconsistencies in Xie et al.'s ( (mentioning)
confidence: 99%
“…The scientific methods for meta-analysis are well-established (Borenstein et al., 2009; Koricheva et al., 2013; Glass, 2015; Cooper et al., 2019). Other recommended practices go further, also recommending sharing data and code for quality control, reproducibility of results, and updating earlier meta-analyses (Button et al., 2013; Lakens et al., 2016; Gurevitch et al., 2018; Maassen et al., 2020; Polanin et al., 2020). However, even the well-established methods for research synthesis are not universally followed.…”
Section: Discussion (mentioning)
confidence: 99%