2018
DOI: 10.1177/2515245917747646

Many Analysts, One Data Set: Making Transparent How Variations in Analytic Choices Affect Results

Abstract: This article was originally submitted for publication to the Editor of Advances in Methods and Practices in Psychological Science (AMPPS) in 2015. When the submitted manuscript was subsequently posted online (Silberzahn et al., 2015), it received some media attention, and two of the authors were invited to write a brief commentary in Nature advocating for greater crowdsourcing of data analysis by scientists. This commentary, arguing that crowdsourced research "can balance discussions, validate findings and bet…

Cited by 664 publications (665 citation statements)
References 40 publications
“…Simulation studies have shown that these differences in analytic choices can have substantial effects on results [3], but it has not been clear to what degree such variability exists and how it affects reported scientific conclusions in practice. Recent work in psychology has attempted to address this through a "many analysts" approach [4], in which the same dataset was analyzed by a large number of groups, uncovering substantial variability in behavioral results across analysis teams.…”
Section: Results (mentioning; confidence: 99%)
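The statement above describes the many-analysts setup only in prose. As a rough illustration (synthetic data and invented analytic choices, not drawn from the Silberzahn et al. study), the Python sketch below applies several defensible analysis pipelines to one simulated dataset and prints the resulting spread of effect estimates:

```python
# Hypothetical sketch: one simulated dataset, several defensible pipelines,
# differing effect estimates. Not the analyses from the cited studies.
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Synthetic data: the outcome depends weakly on the predictor of interest,
# more strongly on a covariate, plus noise.
x = rng.normal(size=n)   # predictor of interest
z = rng.normal(size=n)   # covariate
y = 0.15 * x + 0.40 * z + rng.normal(size=n)

def slope(design, outcome):
    """Least-squares coefficient on the first column of the design matrix."""
    coef, *_ = np.linalg.lstsq(design, outcome, rcond=None)
    return coef[0]

def team_a():  # simple regression, no covariate adjustment
    return slope(np.column_stack([x, np.ones(n)]), y)

def team_b():  # adjust for the covariate z
    return slope(np.column_stack([x, z, np.ones(n)]), y)

def team_c():  # exclude outcomes more than 2 SD from the mean, then adjust
    keep = np.abs(y - y.mean()) < 2 * y.std()
    return slope(np.column_stack([x[keep], z[keep], np.ones(keep.sum())]), y[keep])

def team_d():  # dichotomize the predictor with a median split, no covariate
    xb = (x > np.median(x)).astype(float)
    return slope(np.column_stack([xb, np.ones(n)]), y)

for name, fit in [("no covariate", team_a), ("covariate-adjusted", team_b),
                  ("outliers excluded", team_c), ("median split", team_d)]:
    print(f"{name:20s} estimated effect = {fit():+.3f}")
```

Even on this toy dataset the pipelines disagree in the magnitude of the estimated effect (the median-split pipeline is on a different scale altogether), mirroring in miniature the variability across teams that the many-analysts studies document at full scale.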
“…It is common for researchers to begin a study with a general sense of how the methodology will be implemented, how the hypotheses will be tested, what exclusion rules will be applied, how variables will be combined, and which model form, covariates, and characteristics will be used. However, "a general sense" inevitably provides flexibility in making consequential decisions that could influence study execution, analysis, and reporting (Silberzahn et al., 2018; Simmons, Nelson, & Simonsohn, 2011). Effective preregistration requires converting that general sense into precise, explicit plans that anticipate what has not yet occurred and decisions about what will be done.…”
Section: Preregistration Improves With Practice (mentioning; confidence: 99%)
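To make the contrast between "a general sense" and a precise plan concrete, here is a hypothetical sketch of an analysis plan written down as explicit, machine-readable decisions before the data are seen. All field names, thresholds, and rules are invented for illustration and are not taken from any cited preregistration:

```python
# Hypothetical preregistered analysis plan: every consequential choice is
# fixed in advance, leaving no room for post hoc analytic flexibility.
from dataclasses import dataclass

@dataclass(frozen=True)
class AnalysisPlan:
    hypothesis: str
    exclusion_rules: tuple   # applied in this order, before any modeling
    outcome: str
    predictors: tuple
    covariates: tuple
    model: str
    alpha: float = 0.05

PREREGISTERED_PLAN = AnalysisPlan(
    hypothesis="Condition B yields higher accuracy than condition A.",
    exclusion_rules=(
        "drop participants with < 80% catch-trial accuracy",
        "drop trials with response time < 200 ms or > 3 SD above the mean",
    ),
    outcome="accuracy",
    predictors=("condition",),
    covariates=("age", "baseline_score"),
    model="mixed-effects logistic regression, random intercept per participant",
)

print(PREREGISTERED_PLAN)
```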
“…A more extreme version of this robust modeling approach uses multiple different models that formalize the same psychological theory (Dutilh et al., 2018). Analogously to the "many analysts" approach in data analysis, the goal of this approach is to test the variation in findings arising from different researchers tackling the same problem using different reasonable methods (Silberzahn et al., 2018). If different models converge on the same findings, it suggests the models capture the theory and the inferences are robust.…”
Section: Making Modeling Robust (mentioning; confidence: 99%)
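As a rough illustration of the convergence idea (synthetic data and two invented formalizations, not the models of Dutilh et al.), the sketch below fits an additive and a multiplicative model of the same hypothesized condition effect to one simulated dataset and checks whether the two formalizations agree on the direction of the effect:

```python
# Hypothetical sketch: two formalizations of the same hypothesis applied to
# the same synthetic response-time data, checked for qualitative agreement.
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Simulate response times (seconds) for two conditions; condition B is slower.
rt_a = rng.lognormal(mean=-0.35, sigma=0.3, size=n)
rt_b = rng.lognormal(mean=-0.25, sigma=0.3, size=n)

def effect_additive(a, b):
    """Additive model: condition effect as a difference of mean RTs."""
    return b.mean() - a.mean()

def effect_multiplicative(a, b):
    """Multiplicative model: condition effect as a ratio of geometric means."""
    return np.exp(np.log(b).mean() - np.log(a).mean())

add = effect_additive(rt_a, rt_b)
mult = effect_multiplicative(rt_a, rt_b)

print(f"additive model:       B - A = {add:+.3f} s")
print(f"multiplicative model: B / A = {mult:.3f}x")

# Convergence check: both formalizations should agree on whether condition B
# is slower (positive difference, ratio above 1) before the inference is trusted.
agree = (add > 0) == (mult > 1)
print("models converge on the direction of the effect:", agree)
```

If the two formalizations disagreed on even the direction of the effect, that would be a signal that the conclusion depends on the modeling choice rather than on the theory being tested.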