2023
DOI: 10.1098/rsos.221306

Replication of the natural selection of bad science

Abstract: This study reports an independent replication of the findings presented by Smaldino and McElreath (Smaldino, McElreath 2016 R. Soc. Open Sci. 3, 160384 (doi:10.1098/rsos.160384)). The replication was successful with one exception. We find that selection acting on scientists' propensity for replication frequency caused a brief period of exuberant replication not observed in the original paper due to a coding error. This difference does not, however,…

Cited by 6 publications (4 citation statements) · References 27 publications
“…The model demonstrates that this way of conducting research, called "bad science", can rapidly spread through a group of laboratories, and become its dominant approach. The main results of the model were recently replicated [33], and its framework has been used to conduct virtual experiments of interventions such as auditing of research facilities, making publication of negative results more prestigious, improving peer review, assigning research funds randomly or according to methodological integrity, and researchers expending effort on the selection of strong hypotheses through theory development [34][35][36].…”
Section: PLOS ONE
mentioning, confidence: 99%
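The citing passage above summarizes the mechanism of the model being replicated: laboratories are rewarded for publishable (positive) results, and successful laboratories propagate their methods, so low-effort, false-positive-prone practice can spread. The following is a minimal illustrative sketch of that selection dynamic in Python; it is not the authors' model or a reimplementation of it, and the structure, parameter names, and values are assumptions chosen only to show the qualitative effect.

```python
import random

# Toy sketch of selection among labs (illustrative assumptions only, not
# the Smaldino-McElreath model): labs differ in methodological effort,
# are rewarded solely for positive results, and productive labs are copied.

N_LABS = 50          # population of labs
GENERATIONS = 2000   # replacement events
MUTATION_SD = 0.02   # drift in a descendant lab's effort

def publications(effort):
    """Positive results produced in one round. Low effort means more,
    shallower studies and a higher rate of (mostly false) positives,
    so a positive-results-only filter rewards sloppier labs."""
    hypotheses = int(10 * (1.2 - effort))        # low effort -> more studies
    positive_rate = 0.1 + 0.4 * (1.0 - effort)   # low effort -> more positives
    return sum(1 for _ in range(hypotheses) if random.random() < positive_rate)

labs = [random.uniform(0.5, 1.0) for _ in range(N_LABS)]  # initial effort levels

for _ in range(GENERATIONS):
    scores = [publications(e) for e in labs]
    worst = scores.index(min(scores))
    best = scores.index(max(scores))
    # Selection: the least productive lab is replaced by a noisy copy of
    # the most productive one; publication count is the only currency.
    labs[worst] = min(1.0, max(0.0, labs[best] + random.gauss(0, MUTATION_SD)))

print(f"mean methodological effort after selection: {sum(labs) / N_LABS:.2f}")
```

Running this sketch, mean effort drifts downward over the selection events, which is the qualitative "natural selection of bad science" effect the cited model demonstrates with a far richer structure.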
“…Although simulation studies are a powerful tool for methodological research, results from those studies, as the example of the 'one-in-ten rule' illustrates, are not definitive. Like empirical results, results from simulation studies need to be reproduced and replicated to verify their veracity [2], and this is increasingly called for [8][9][10]. So far, there is limited evidence on the reproducibility or replicability of simulation studies.…”
Section: Introduction
mentioning, confidence: 99%
“…The paper's conclusions remain unaffected after correcting for these errors. For further discussion, see [1], which also presents a reimplementation of the model in the R programming language, archived in the Software Heritage archive at https://archive.softwareheritage.org/swh:1:snp:60ab9f391840fbb0d226fdbce35169b271e00918;origin=https://gitlab.com/fkohrt/bachelorarbeit-code.…”
mentioning, confidence: 99%