2021
DOI: 10.1098/rsos.201494
Analytic reproducibility in articles receiving open data badges at the journal Psychological Science: an observational study

Abstract: For any scientific report, repeating the original analyses upon the original data should yield the original outcomes. We evaluated analytic reproducibility in 25 Psychological Science articles awarded open data badges between 2014 and 2015. Initially, 16 (64%, 95% confidence interval [43,81]) articles contained at least one 'major numerical discrepancy' (>10% difference) prompting us to request input from original authors. Ultimately, target values were reproducible without author in…

Cited by 75 publications (67 citation statements). References 33 publications.
“…Second, although the focus of this study was transparency and reproducibility-related practices, this does not imply that the adoption of these practices is sufficient to promote the goals they are intended to achieve. For example, poorly documented data may not enable analytic reproducibility (Hardwicke, Bohn, et al., 2020; Hardwicke et al., 2018), and inadequately specified preregistrations may not sufficiently constrain researcher degrees of freedom (Claesen et al., 2019; Bakker et al., 2020). Third, we relied only on published information.…”
Section: Discussion
confidence: 99%
“…who assessed the reproducibility of papers published in Cognition, estimated a reproducibility rate of 31% without author assistance, and an additional increase of 31% when the authors of the original manuscript helped to reproduce the outcomes. Similarly, Hardwicke et al. (2021) report a reproducibility rate of 36% without author assistance, and an increase of 24% with author assistance for articles published in Psychological Science that received an ‘open data badge'. Similar reproducibility rates were reported for political science…”
confidence: 95%
“…Data, code, materials, and design and analysis transparency standards require that the respective elements are shared with readers, thereby allowing for thorough checking and reproducibility of research results. Even though data and code availability do not always ensure perfect reproducibility (Hardwicke et al. 2021; Chang and Li 2015), the ability to rerun the code on the data increases trust in the results. Furthermore, the ability to inspect all the materials and raw results allows for independent evaluation of threats to external validity (Shadish, Campbell, and Cook 2001), such as under- or over-representation of certain demographics or specifics of materials used for intervention.…”
Section: Transparency and Openness Promotion Standards
confidence: 99%