2018
DOI: 10.1007/s13164-018-0400-9

Estimating the Reproducibility of Experimental Philosophy

Abstract: Responding to recent concerns about the reliability of the published literature in psychology and other disciplines, we formed the X-Phi Replicability Project (XRP) to estimate the reproducibility of experimental philosophy (osf.io/dvkpr). Drawing on a representative sample of 40 x-phi studies published between 2003 and 2015, we enlisted 20 research teams across 8 countries to conduct a high-quality replication of each study in order to compare the results to the original published findings. We found that x-ph…

Cited by 136 publications (131 citation statements)
References 83 publications
“…Chapter 5 (“Eight Defenses of the Method of Cases”) rebuts eight responses in defense of the method of cases. I defend the experimental competencies of experimental philosophers, in line with recent results suggesting that experimental philosophers’ findings replicate well (Cova et al., forthcoming) and do not suffer from various biases (Colombo, Duev, Nuijten, and Sprenger, ; Stuart, Colaço, & Machery, ms). I also provide evidence that reflection does not influence the judgments made in response to philosophical cases (see also Colaço, Kneer, Alexander, & Machery, ms).…”
Section: Answering Objections (supporting)
confidence: 72%
“…Such investigations provide insight into what researchers are doing well and what could be done to improve research and reporting practices in future studies. This complements direct assessments of replicability, such as the XPhi Replicability Project, a recent large-scale effort to reproduce central Experimental Philosophy findings (Cova et al 2018 https://osf.io/dvkpr/), which has provided encouraging data about current levels of replication in the field. We should not be complacent, though: Ensuring continued replicability requires the consistent adoption of appropriate reporting practices.…”
Section: Discussion (mentioning)
confidence: 71%
“…Notably, as a field, Experimental Philosophy seems aware that reproducibility can and should be monitored, with both organized replication projects (Cova et al 2018) and online resources tracking replicability (http://experimentalphilosophy.yale.edu/xphipage/Experimental%20Philosophy-Replications.html).…”
(mentioning)
confidence: 99%
“…We use the term "many-to-one design" to refer generically to any design in which an original study is replicated in multiple sites. Many-to-one replication research is a nascent, but rapidly expanding, field: we are aware of at least 79 completed and 55 ongoing many-to-one replication studies to date, all completed or initiated since 2014 and in experimental psychology and experimental philosophy alone (completed: [2,11,16,23,31,33,42,57,89,118]; ongoing: [5,32,58,88]). …”
Section: Introduction (mentioning)
confidence: 99%