2015
DOI: 10.1093/pan/mpv006
Underreporting in Political Science Survey Experiments: Comparing Questionnaires to Published Results

Abstract: The accuracy of published findings is compromised when researchers fail to report and adjust for multiple testing. Preregistration of studies and the requirement of preanalysis plans for publication are two proposed solutions to combat this problem. Some have raised concerns that such changes in research practice may hinder inductive learning. However, without knowing the extent of underreporting, it is difficult to assess the costs and benefits of institutional reforms. This paper examines published survey ex…

Cited by 34 publications (29 citation statements); references 22 publications.
“…A failure to disclose all outcome variables seems relatively common in the social sciences. Franco et al. (2015, 2016) found that 60% (N=53) of the experimental political science papers and 70% (N=32) of the experimental psychology papers they assessed reported fewer outcome variables in the published study than were actually included in the study.…”
Section: Questionable Research Practices
“…The primary way communication researchers have examined the prevalence of QRPs is through content analysis (Matthes et al., 2015; Vermeulen et al., 2015). This is certainly a valid approach for identifying certain QRPs such as publication bias, non-publication of studies or outcome variables, and rounding (Franco et al., 2014, 2015; Hartgerink et al., 2016). But other QRPs, such as imputing missing data, peeking at data, and running additional analyses, are harder to isolate with a content analysis of the published literature, as these QRPs are generally not disclosed in the paper.…”
Section: Self-reported Questionable Research Practices
“…Recently, scientific disciplines such as psychology [9], medicine [3], economics [6], and political science [16] have seen many published research articles fail to replicate, i.e., rerunning the experiment with more statistical power does not yield an effect as strong as, or in the same direction as, the original experiment.…”
Section: Introduction
“…Examining the race discrimination literature, Zigerell (2017) demonstrated several cases of selective reporting in which alternative model specifications remained unreported when they led to different results. In a large-scale examination of 249 political science studies from a research competition for which data and research questions were known, Franco et al. (2015) showed that only one in five published studies reported all experimental conditions and outcome variables. The average study left 0.5 experimental conditions and 3.1 experimental outcomes undisclosed.…”
Section: Analytical Robustness