2012
DOI: 10.1007/s10791-012-9206-z
Crowdsourcing interactions: using crowdsourcing for evaluating interactive information retrieval systems

Cited by 48 publications (37 citation statements). References 8 publications.
“…As highlighted by Zuccon et al. [54], crowdsourcing provides an alternative to traditional lab-based user studies for capturing user interactions and search behaviours. Greater volumes of data can be obtained from more heterogeneous workers at a lower cost, all within a shorter timeframe.…”
Section: Crowdsourced Subjects and Quality Control
confidence: 99%
“…Of course, pitfalls of a crowdsourced approach include the possibility of workers completing tasks as efficiently as possible, or submitting their tasks without performing the requested operations [13]. Despite these issues, it has been shown that there is little difference in quality between crowdsourced and lab-based studies [54]. Nevertheless, quality control is a major component of a well-executed crowdsourced experiment [5].…”
Section: Crowdsourced Subjects and Quality Control
confidence: 99%
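The quality-control concern raised in the statement above can be made concrete. Below is a minimal Python sketch of gold-question filtering, one common control in crowdsourced studies; the Worker structure, the gold answer set, and the 0.7 accuracy threshold are illustrative assumptions, not details taken from the cited studies [5, 13, 54].

```python
# Sketch: filter crowd workers by their accuracy on "gold" tasks with
# known answers. All names and thresholds here are hypothetical.

from dataclasses import dataclass, field

# Hypothetical gold set: task id -> known correct answer.
GOLD_ANSWERS = {"t1": "relevant", "t2": "not_relevant", "t3": "relevant"}

@dataclass
class Worker:
    worker_id: str
    answers: dict = field(default_factory=dict)  # task id -> submitted answer

def gold_accuracy(worker: Worker) -> float:
    """Fraction of gold tasks this worker answered correctly."""
    graded = [t for t in worker.answers if t in GOLD_ANSWERS]
    if not graded:
        return 0.0  # no gold evidence -> treat as failing quality control
    correct = sum(worker.answers[t] == GOLD_ANSWERS[t] for t in graded)
    return correct / len(graded)

def filter_workers(workers, threshold=0.7):
    """Keep only workers whose gold accuracy meets the threshold."""
    return [w for w in workers if gold_accuracy(w) >= threshold]

workers = [
    Worker("w1", {"t1": "relevant", "t2": "not_relevant", "t9": "relevant"}),
    Worker("w2", {"t1": "not_relevant", "t2": "relevant"}),  # fails gold checks
]
print([w.worker_id for w in filter_workers(workers)])  # -> ['w1']
```

A filter like this addresses the specific pitfall quoted above: workers who rush or submit without performing the task tend to miss gold questions and are excluded before their data enters the analysis.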
“…Nevertheless, due to the recent nature of this type of citizen participation initiative, there is little empirical research on its effectiveness. In a recent study, Zuccon et al. [78] compared the effectiveness of crowdsourcing with that of traditional laboratory-based user studies for evaluating interactive information retrieval systems. Their results show that crowdsourcing reaches a larger number of participants and provides results as valuable as those of traditional methods, at half the cost and with five times more data collected.…”
Section: Crowdsourcing: Models and Impact Assessment
confidence: 99%
“…From a practical perspective, this means that managers have to decide whether they are more interested in the quantity of ideas or in their quality. Zuccon et al. [78] note that a more effective strategy is to combine both methodologies: traditional methods for pilot studies, followed by crowdsourcing, which provides access to a larger number of potential participants.…”
Section: Crowdsourcing: Models and Impact Assessment
confidence: 99%
“…A variety of recent work has investigated crowdsourcing methods for IR data collection [1], both for Cranfield-style relevance judging [35] and for interactive IR studies [78]. Potential benefits include faster, easier, cheaper, and more scalable data collection; greater diversity of data than that provided by traditional assessors and university students; and a potentially greater similarity between the crowd and typical IR system users.…”
Section: Crowdsourcing
confidence: 99%
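To illustrate the Cranfield-style crowd judging mentioned in the statement above, here is a minimal Python sketch that aggregates redundant worker labels into a single relevance judgment per query-document pair by majority vote; the label values and example data are assumptions for illustration, not the method of any cited paper.

```python
# Sketch: collapse several crowd workers' labels for each (query, doc)
# pair into one judgment via majority vote. Data here is hypothetical.

from collections import Counter

def majority_label(labels):
    """Return the most frequent label; ties go to the first-seen label."""
    return Counter(labels).most_common(1)[0][0]

# Hypothetical raw judgments: (query, doc) -> labels from several workers.
raw = {
    ("q1", "d1"): ["relevant", "relevant", "not_relevant"],
    ("q1", "d2"): ["not_relevant", "not_relevant", "relevant"],
}

qrels = {pair: majority_label(labels) for pair, labels in raw.items()}
print(qrels)  # {('q1', 'd1'): 'relevant', ('q1', 'd2'): 'not_relevant'}
```

Majority voting is only a baseline; redundancy of this kind is one way crowd judging trades the expertise of traditional assessors for the scale and diversity the quoted statement describes.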