2016
DOI: 10.1371/journal.pbio.1002331

Where Have All the Rodents Gone? The Effects of Attrition in Experimental Research on Cancer and Stroke

Abstract: Given small sample sizes, loss of animals in preclinical experiments can dramatically alter results. However, effects of attrition on distortion of results are unknown. We used a simulation study to analyze the effects of random and biased attrition. As expected, random loss of samples decreased statistical power, but biased removal, including that of outliers, dramatically increased probability of false positive results. Next, we performed a meta-analysis of animal reporting and attrition in stroke and cancer…
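
The simulation logic summarized in the abstract can be sketched in a few lines. The script below is an illustration under assumed parameters (groups of 10, an effect of 1 SD where one exists, two animals lost per group), not the authors' published code: random attrition only erodes statistical power, whereas biased attrition, here dropping the animals that most weaken the apparent effect, produces false positives where no effect exists.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, n_drop, n_sim, alpha = 10, 2, 5000, 0.05  # assumed parameters

def significant(a, b):
    """Two-sample t-test at the chosen alpha level."""
    return stats.ttest_ind(a, b).pvalue < alpha

power_full = power_random = fp_full = fp_biased = 0
for _ in range(n_sim):
    # Scenario 1: a real effect (1 SD). Random attrition only costs power.
    control = rng.normal(0.0, 1.0, n)
    treated = rng.normal(1.0, 1.0, n)
    power_full += significant(control, treated)
    power_random += significant(rng.choice(control, n - n_drop, replace=False),
                                rng.choice(treated, n - n_drop, replace=False))

    # Scenario 2: no real effect. Biased attrition (dropping the highest
    # controls and the lowest treated animals) manufactures false positives.
    c0 = rng.normal(0.0, 1.0, n)
    t0 = rng.normal(0.0, 1.0, n)
    fp_full += significant(c0, t0)
    fp_biased += significant(np.sort(c0)[:n - n_drop], np.sort(t0)[n_drop:])

print(f"power, full data        : {power_full / n_sim:.2f}")
print(f"power, random attrition : {power_random / n_sim:.2f}")
print(f"false positives, full data        : {fp_full / n_sim:.3f}")
print(f"false positives, biased attrition : {fp_biased / n_sim:.3f}")
```

With these assumed settings, the random-attrition arm shows noticeably lower power than the full data set, while the biased-attrition arm yields a false positive rate far above the nominal 5%.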

Cited by 107 publications (122 citation statements) | References 27 publications

Citation statements (ordered by relevance):

“…A particularly problematic aspect of researcher flexibility is the decision to remove outliers after having seen their influence on the P-value. Selective removal of outliers has a high potential of generating biased results (Holman et al, 2016), so the removal of data points is generally discouraged. Publications should always explain the reasons behind any attrition (loss of data points between study initiation and data analysis) and should discuss whether the missed samples might have led to biased results.…”
Section: (E) Researcher Degrees Of Freedom: (2) Flexibility In Analysis (mentioning)
confidence: 99%
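
As a hedged illustration of the point quoted above (not code from the citing or the cited paper), the sketch below conditions outlier removal on the P-value: when the first test is non-significant, the single most extreme observation is dropped and the test repeated. Because the second look can only add rejections under the null, the overall false positive rate necessarily exceeds the nominal 5%.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, n_sim, alpha = 8, 10_000, 0.05  # assumed group size and replicate count

naive = flexible = 0
for _ in range(n_sim):
    a = rng.normal(size=n)
    b = rng.normal(size=n)  # same distribution, so any "effect" is spurious
    p = stats.ttest_ind(a, b).pvalue
    naive += p < alpha

    if p >= alpha:
        # Researcher degree of freedom: only after a non-significant result,
        # drop the observation farthest from its own group mean, then retest.
        dev_a, dev_b = np.abs(a - a.mean()), np.abs(b - b.mean())
        if dev_a.max() >= dev_b.max():
            a = np.delete(a, dev_a.argmax())
        else:
            b = np.delete(b, dev_b.argmax())
        p = stats.ttest_ind(a, b).pvalue
    flexible += p < alpha

print(f"false positive rate, no removal       : {naive / n_sim:.3f}")
print(f"false positive rate, selective removal: {flexible / n_sim:.3f}")
```
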
“…Of considerable importance is the internal and external validity of this approach when broadened in its application [4, 71–73]. Poor internal validity has been identified as a major contributor to false positive results in pre-clinical studies [4, 5].…”
Section: A Proposal For a Pre-clinical Methodology That Incorporates… (mentioning)
confidence: 99%
“…With numerous neuroprotectant compounds tested and failed [2, 3], the inability to predict successful clinical outcomes from apparently positive pre-clinical and early-phase clinical studies is all too common. While there are likely several explanations for these failures, including lack of power [4, 5], rigor in design and execution, and a disconnection between the pre-clinical design and human trial [6, 7], we believe there is a fundamental problem in the identification of effective agents due to the way clinical and pre-clinical studies are designed and analyzed for disorders heterogeneous with respect to the myriad of factors that influence outcome, including innate biological, environmental, and methodological variance and experimental noise [8]. …”
Section: Introduction (mentioning)
confidence: 99%
“…Of those that did report numbers, ≈30% reported that they had dropped rodents from their study analysis, but <25% of those explained why. Using simulated data, we demonstrated that this can lead to a major distortion of the results [20], especially when group sizes are small.…”
Section: Can Mice Mimic Human Stroke (mentioning)
confidence: 99%
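
A minimal sketch, under assumed parameters rather than the paper's own simulation, of why small groups are hit hardest: dropping the single lowest-responding treated animal inflates the estimated effect by roughly 0.3 SD when n = 5 per group, but by under 0.05 SD when n = 50.

```python
import numpy as np

rng = np.random.default_rng(2)
true_effect, n_sim = 0.5, 20_000  # assumed true effect of 0.5 SD

for n in (5, 10, 20, 50):
    bias = 0.0
    for _ in range(n_sim):
        control = rng.normal(0.0, 1.0, n)
        treated = rng.normal(true_effect, 1.0, n)
        treated = np.delete(treated, treated.argmin())  # drop the "non-responder"
        bias += (treated.mean() - control.mean()) - true_effect
    print(f"n = {n:2d}: mean overestimate of the effect = {bias / n_sim:+.2f} SD")
```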