2020
DOI: 10.1007/978-3-030-45442-5_75
Living Labs for Academic Search at CLEF 2020

Abstract: The need for innovation in academic search, and in IR in general, is shown by the stagnating system performance in controlled evaluation campaigns, as demonstrated in TREC and CLEF meta-evaluation studies, as well as by user studies in real-world scientific information systems and digital libraries. The question of what constitutes relevance in academic search is multi-layered and a topic that has driven research communities for years. The Living Labs for Academic Search (LiLAS) workshop has the goal to inspir…

Cited by 4 publications (4 citation statements); References 7 publications.
“…As initially mentioned, intermediate evaluation in a living lab setting is especially challenging because of the lack of a test collection. To provide a starting point for the LiLAS lab, Schaer et al. provided head queries and candidate documents from the two real-world academic search systems, allowing the construction of pseudo test collections [21]. Pseudo test collections are a long-established method to create synthetic queries and relevance judgments [6].…”
Section: Related Work
confidence: 99%
“…It is integrated into these online academic search engines, which enables participants to evaluate their experimental retrieval systems (e.g., domain-specific research data recommendations based on publications [26]) with real users. STELLA will be part of the Living Labs for Academic Search (LiLAS) lab at CLEF 2020 [25].…”
Section: A Novel Concept for Evaluation Infrastructures
confidence: 99%
“…After LiLAS ran as a workshop lab at CLEF 2020 [12,13], a full evaluation lab will take place in 2021. This lab's unique selling point is that we offer two tasks to test this approach in two different academic search domains and evaluation setups.…”
Section: Introduction
confidence: 99%