This study explores how researchers’ analytical choices affect the reliability of scientific findings. Most discussions of reliability problems in science focus on systematic biases. We broaden the lens to emphasize the idiosyncrasy of conscious and unconscious decisions that researchers make during data analysis. We coordinated 161 researchers in 73 research teams and observed their research decisions as they used the same data to independently test the same prominent social science hypothesis: that greater immigration reduces support for social policies among the public. In this typical case of social science research, research teams reported widely diverging numerical findings and substantive conclusions despite identical starting conditions. Researchers’ expertise, prior beliefs, and expectations barely predict the wide variation in research outcomes. More than 95% of the total variance in numerical results remains unexplained even after qualitative coding of all identifiable decisions in each team’s workflow. This reveals a universe of uncertainty that remains hidden when a single study is considered in isolation. The idiosyncratic way in which researchers’ results and conclusions varied is a previously underappreciated explanation for why many scientific hypotheses remain contested. These results call for greater epistemic humility and clarity in reporting scientific findings.
The GLES Open Science Challenge 2021 was a pioneering initiative in quantitative political science. Aimed at increasing the adoption of replicable and transparent research practices, it led to this special issue. The project combined the rigor of registered reports (a publication format in which studies are evaluated prior to data collection or access and prior to analysis) with quantitative political science research in the context of the 2021 German federal election. This special issue, which features the registered reports that resulted from the project, shows that transparent research following open science principles benefits our discipline and contributes substantially to quantitative political science. In this introduction to the special issue, we first elaborate on why more transparent research practices are necessary to ensure the cumulative progress of scientific knowledge. We then show how registered reports can increase the transparency of scientific practices. Next, we discuss the application of open science practices in quantitative political science to date. Finally, we describe the process and schedule of the GLES Open Science Challenge and give an overview of the contributions included in this special issue.
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations: citations that display the context in which an article is cited and indicate whether the citing article provides supporting or contrasting evidence. scite is used by students and researchers from around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.