The SIGIR 2019 Open-Source IR Replicability Challenge (OSIRRC 2019)

Proceedings of the 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR 2019)
DOI: 10.1145/3331184.3331647

Cited by 18 publications (14 citation statements)
References 10 publications
“…There are, in fact, many variants of the scoring function: beyond the original version proposed by Robertson et al. [8], many variants exist that include small tweaks by subsequent researchers. Also, researchers using different IR systems report (sometimes quite) different effectiveness measurements for their implementation of BM25, even on the same test collections; consider for example the results reported in OSIRRC 2019, the open-source IR replicability challenge at SIGIR 2019 [2]. Furthermore, BM25 is parameterized in terms of k1 and b (plus k2 and k3 in the original formulation), and researchers often neglect to include the parameter settings in their papers.…”
Section: Introduction (mentioning)
confidence: 99%
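For context, a commonly used form of the BM25 scoring function, the baseline from which the variants discussed in the excerpt above diverge, is:

\[
\mathrm{score}(D, Q) = \sum_{i=1}^{n} \mathrm{IDF}(q_i) \cdot \frac{f(q_i, D)\,(k_1 + 1)}{f(q_i, D) + k_1 \left(1 - b + b \cdot \frac{|D|}{\mathrm{avgdl}}\right)}
\]

Here f(q_i, D) is the frequency of query term q_i in document D, |D| is the length of D, and avgdl is the average document length in the collection. Variants differ mainly in the IDF formulation and in small tweaks to the saturation term, which helps explain why different systems report different effectiveness numbers for "BM25" on the same test collections.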
“…Although each artifact broke in its own idiosyncratic way, we do notice common themes. Given continued interest in reproducibility, most notably a new Docker-based iteration of OSIRRC in 2019 [3], these "lessons learned" might form the basis of future best practices.…”
Section: Lessons Learned (mentioning)
confidence: 99%
“…PRIMAD implicitly assumes a static view of reproducibility, as opposed to the process-oriented view we advocate. More concretely, from the successes as well as the failures in replicating OSIRRC 2015, we are able to extract a number of "lessons learned" that can be further distilled into best practices, especially to inform ongoing efforts such as the latest iteration of OSIRRC [3].…”
Section: Introduction (mentioning)
confidence: 99%
“…Workshops deal with reproducibility either reactively or proactively. For example, the CENTRE workshop [9] challenges participants to reconstruct IR systems and their results, whereas the Open-Source IR Replicability Challenge (OSIRRC) [7] motivated participants to package their retrieval systems and corresponding software dependencies in advance, preparing them for appropriate reuse.…”
Section: Related Work (mentioning)
confidence: 99%
“…The results of our experimental setups showed that we can replicate the outcomes fairly well, whereas reproduced outcomes are significantly lower. Having the reimplementation of an ad-hoc retrieval system at hand, we decided to contribute it to the OSIRRC@SIGIR2019 workshop [7]. All contributions resulted in a shared library of Docker images, to which we contributed the IRC-CENTRE2019 image [4].…”
Section: Preliminary Work and Research Proposal (mentioning)
confidence: 99%