Online Evaluation for Information Retrieval
2016
DOI: 10.1561/1500000051

Cited by 101 publications (27 citation statements)
References 173 publications
“…By following certain examination assumptions, click models [5,10,33] can simulate user behavior and achieve promising performance in alleviating position bias, click prediction and relevance estimation. Click-through data has also been used as a relevance signal to train learning-to-match or learning-to-rank models [1,16,21] and as a signal for evaluation [15].…”
Section: Related Work
mentioning, confidence: 99%
“…Even if items are presented at random, users can choose whether or not to provide item feedback. Note that other context biases, such as presentation and position bias, can also influence users [7].…”
Section: Why Share RS Environments?
mentioning, confidence: 99%
“…Missing Data. Datasets can be missing unobserved confounding variables, which can lead to biased results [5,7]. Moreover, users naturally interact with only a small portion of the available items, so datasets represent an incomplete picture of user preferences (e.g., only bandit feedback [24]).…”
Section: Why Share RS Environments?
mentioning, confidence: 99%