2011
DOI: 10.1145/2034863.2034873

Repeatability and workability evaluation of SIGMOD 2011

Abstract: SIGMOD has offered, since 2008, to verify the experiments published in the papers accepted at the conference. This year, we have been in charge of reproducing the experiments provided by the authors (repeatability), and exploring changes to experiment parameters (workability). In this paper, we assess the SIGMOD repeatability process in terms of participation, review process and results. While the participation is stable in terms of number of submissions, we find this year a sharp contrast between the high par…

Cited by 24 publications (18 citation statements)
References 3 publications
“…Advocates of reproducibility have grown over the years in many disciplines, from signal processing [Vandewalle et al, 2009] to computational harmonic analysis [Donoho et al, 2009] to psychology [Spies et al, 2012]. Organized community efforts include reproducibility tracks at conferences [Manolescu et al, 2008; Bonnet et al, 2011; Wilson et al, 2012], reproducibility editors in journals [Diggle and Zeger, 2009; Peng, 2009], and numerous community workshops and forums [e.g., Bourne et al, 2011]. Repositories of shared computational workflows enable scientists to reuse workflows published by others and facilitate reproducibility, although these repositories do not yet have significant uptake in geosciences [De Roure et al, 2009; Missier et al, 2010; Garijo et al, 2014].…”
Section: The Reproducibility Crisis (mentioning)
confidence: 99%
“…As a result of this challenge, some authors proposed the use of virtual machines as a way of preserving the execution environment of an experiment [16,17]. Also, as part of the SIGMOD conference in 2011, a study was carried out to evaluate how a set of repeatability guidelines proposed to submitting authors (i.e., using virtual machines, pre- and post-conditions, and provenance-based workflow infrastructures) could help reviewers reproduce the experiments described in the submitted paper [18].…”
Section: Current Approaches (mentioning)
confidence: 99%
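
As an illustrative aside (not taken from the cited papers): the statement above refers to preserving an experiment's execution environment so that reviewers can repeat it later. A minimal Python sketch of that general idea, recording the interpreter, OS, and installed-package versions as a provenance snapshot, might look like the following; the helper name capture_environment and the output file environment.json are hypothetical, and a full repeatability submission would typically ship a complete virtual machine image rather than a snapshot like this.

    import json
    import platform
    import subprocess
    import sys

    def capture_environment(outfile="environment.json"):
        """Record interpreter, OS, and package versions alongside an
        experiment's results so the environment can be reconstructed.
        (Hypothetical helper; not from the SIGMOD 2011 guidelines.)"""
        snapshot = {
            "python": sys.version,
            "platform": platform.platform(),
            "machine": platform.machine(),
            # 'pip freeze' lists the exact versions of installed packages.
            "packages": subprocess.run(
                [sys.executable, "-m", "pip", "freeze"],
                capture_output=True, text=True, check=True,
            ).stdout.splitlines(),
        }
        with open(outfile, "w") as f:
            json.dump(snapshot, f, indent=2)

    if __name__ == "__main__":
        capture_environment()

A snapshot like this covers only part of repeatability; the cited work also relies on pre- and post-conditions and provenance-aware workflow infrastructures to check that reruns behave as expected.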
“…Reproducibility and reusability are core scientific concepts, enabling knowledge transfer and independent research verification. Alarming reports concerning the failure to reproduce empirical studies in a variety of scientific fields [2,12,45] are leading to the development of services, tools and strategies that aim to support key reproducible research practices [60].…”
Section: Introduction (mentioning)
confidence: 99%