2019
DOI: 10.31234/osf.io/jkws4
Preprint
Detection of data fabrication using statistical tools

Abstract: Scientific misconduct potentially invalidates findings in many scientific fields. Improved detection of unethical practices like data fabrication is considered to deter such practices. In two studies, we investigated the diagnostic performance of various statistical methods to detect fabricated quantitative data from psychological research. In Study 1, we tested the validity of statistical methods to detect fabricated data at the study level using summary statistics. Using (arguably) genuine data from the Many…
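The study-level checks on summary statistics mentioned in the abstract can be illustrated with a simple, well-known example from the same family of tools: the GRIM test (Brown & Heathers, 2017), which flags reported means of integer-valued data that are arithmetically impossible for the stated sample size. This sketch is an illustration of the general idea, not one of the specific methods evaluated in the preprint; the function name and rounding convention are my own assumptions.

```python
# Hypothetical sketch of the GRIM test (Brown & Heathers, 2017): checks
# whether a reported mean of integer-valued responses (e.g. Likert items)
# is arithmetically possible for the stated sample size.

def grim_consistent(mean: float, n: int, decimals: int = 2) -> bool:
    """Return True if `mean`, reported to `decimals` places, can arise
    as the mean of n integer-valued responses."""
    total = mean * n
    # The true sum must be an integer near the implied total; allow a
    # one-unit margin on each side to absorb floating-point error.
    for candidate in (int(total) - 1, int(total), int(total) + 1):
        if round(candidate / n, decimals) == round(mean, decimals):
            return True
    return False

# With n = 28 integer responses, a reported mean of 5.21 is possible
# (146 / 28 rounds to 5.21), but a reported mean of 5.19 is not: no
# integer sum near 145.32 rounds to 5.19.
```

Checks of this kind apply only when the raw responses are known to be integers, and an inconsistency signals a reporting or computation problem worth querying, not fabrication by itself.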

Cited by 12 publications (13 citation statements). References 64 publications (100 reference statements).
“…Early post-submission honesty assessment is flawed because current tools for 'after-the-fact' integrity checking have generally not been validated (Khan, 2021b). In a robust evaluation comparing integral versus fabricated data (Hartgerink et al, 2019), the authors concluded: 'we argue against widespread (possibly automatic) application of these tools'. Instead of targeting submissions for early integrity assessments with doubtful tools, journals need to put their own house in order, for example by first getting their instructions to authors right regarding transparency (Malički et al, 2021).…”
Section: Letter (mentioning)
confidence: 99%
“…The findings of Carlisle have major implications and ramifications for medical science and its publication systems [57]. The (mainly statistical) methods available for detecting false data, which are devised by Carlisle and others [58][59][60][61][62][63][64][65][66], have become rather visible and advanced during the past decade. Rushdi and Rushdi [4] have recently suggested a non-statistical method based on the premise that flawed data might be detected via the excessive inconsistencies it causes in a variant of Boolean Analysis called Qualitative Comparative Analysis (QCA) [4,[67][68][69][70][71][72][73][74][75][76].…”
Section: Appendix A: On the Detection of Fabricated, False, Flawed and Wrong Medical Data (mentioning)
confidence: 99%
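As an illustration of the kind of (mainly statistical) false-data checks discussed in these citing papers, the sketch below implements a terminal-digit uniformity test (after Mosimann et al.): the final digits of genuinely measured values are expected to be close to uniformly distributed, so a large chi-square statistic marks data worth closer inspection. The function name and the string input format are my own assumptions, and, consistent with the caution the citing authors quote, a significant result is a prompt for scrutiny, not proof of fabrication.

```python
# Hedged sketch of a terminal-digit test: compare the distribution of
# last digits in reported values against a uniform distribution over
# 0-9 using a chi-square goodness-of-fit statistic.
from collections import Counter

def terminal_digit_chi2(values: list[str]) -> float:
    """Chi-square statistic of last-digit counts against uniformity.
    `values` are numbers as reported strings, so trailing zeros are
    preserved exactly as printed."""
    digits = [v[-1] for v in values if v[-1].isdigit()]
    expected = len(digits) / 10  # uniform expectation per digit
    counts = Counter(digits)
    return sum((counts.get(str(d), 0) - expected) ** 2 / expected
               for d in range(10))

# Compare the statistic against the chi-square critical value for
# df = 9 (16.92 at alpha = 0.05); larger values indicate non-uniform
# terminal digits.
```

For example, a sample whose reported values all end in the same digit yields a very large statistic, while one value per digit 0 through 9 yields exactly zero.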
“…Moreover, it pertains to the validity of the premise that medical science lives up to its claim of being self-correcting. The literature abounds with excellent articles related to this topic [51][52][53][54][55][56][57][58][59][60][61][62][63][64][65][66][77][78][79][80][81][82][83][84][85][86][87][88].…”
Section: Appendix A: On the Detection of Fabricated, False, Flawed and Wrong Medical Data (mentioning)
confidence: 99%
“…Even before the COVID-19 era, there were many methods (mainly statistical) for detecting false data [27][28][29][30][31][32][33][34][35][36][37], which have become rather visible and advanced during the past decade. Rushdi and Rushdi [38,39] have recently suggested methods for avoiding probabilistic fallacies in medical contexts.…”
Section: Introduction (mentioning)
confidence: 99%