2022
DOI: 10.1145/3494522
A Survey on Task Assignment in Crowdsourcing

Abstract: Quality improvement methods are essential to gathering high-quality crowdsourced data, both for research and industry applications. A popular and broadly applicable method is task assignment that dynamically adjusts crowd workflow parameters. In this survey, we review task assignment methods that address: heterogeneous task assignment, question assignment, and plurality problems in crowdsourcing. We discuss and contrast how these methods estimate worker performance, and highlight potential challenges in their …

Cited by 39 publications (11 citation statements)
References 155 publications
“…The common solution is to select the N workers with the best performance, while the worker performance can be evaluated from historical worker data, or gold standard questions (questions with known answers) (Hettiachchi et al., 2022), or estimated by probabilistic inference algorithms such as expectation maximization (Whitehill et al., 2009). However, in addition to avg_p and best, the performance of aggregation results is significantly affected by other factors, for example, dist (a representation of the concept of group diversity).…”
Section: Methods (mentioning, confidence: 99%)
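The excerpt above mentions estimating worker accuracy with expectation maximization. As a minimal sketch of that idea, the following is a simplified "one-coin" EM in the spirit of Dawid-Skene/GLAD for binary questions; the function name, iteration count, and the 0.05–0.95 accuracy clipping are illustrative choices, not from the survey:

```python
import numpy as np

def em_worker_accuracy(answers, n_iter=20):
    """Estimate per-worker accuracy and true labels for binary questions.

    answers: (n_workers, n_questions) array of 0/1 answers.
    Simplified one-coin EM (illustrative; not the survey's exact method).
    """
    # Initialize label posteriors with the majority vote.
    p = answers.mean(axis=0)  # P(true label = 1) per question
    for _ in range(n_iter):
        # M-step: accuracy = expected fraction of answers matching the label.
        acc = (answers * p + (1 - answers) * (1 - p)).mean(axis=1)
        acc = np.clip(acc, 0.05, 0.95)  # keep log-odds finite (assumed bound)
        # E-step: posterior log-odds of label 1, workers treated independently.
        w = np.log(acc / (1 - acc))[:, None]
        log_odds = (answers * w - (1 - answers) * w).sum(axis=0)
        p = 1 / (1 + np.exp(-log_odds))
    return acc, (p > 0.5).astype(int)
```

With a reliable majority and one adversarial worker, the estimated accuracies separate sharply and the recovered labels match the majority's ground truth.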
“…Li and Liu (2015) pointed out that worker performance and variance are important factors for classification tasks. Hettiachchi et al. (2022) reviewed the existing worker selection models. However, the existing work does not investigate how to select the optimal working group with a given number of workers to optimize the performance of the aggregated results.…”
Section: Related Work (mentioning, confidence: 99%)
“…Again, we will focus on summarizing those that are most closely related to ours. Some papers, such as [Hettiachchi et al. (2022)], [Zhang et al. (2017)], study how to assign different crowd workers to multiple tasks in order to maximize the expected accuracy of labels.…”
Section: Crowdsourcing (mentioning, confidence: 99%)
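The excerpt above describes assigning workers to tasks to maximize expected label accuracy. A minimal greedy sketch of that idea is shown below, assuming per-(worker, task) accuracy estimates are already available (e.g., from an EM-style estimator); the function name and quota structure are illustrative, not from any of the cited papers:

```python
def greedy_assign(accuracy, quota):
    """Greedily assign each worker to one task, highest-accuracy pairs first.

    accuracy: dict mapping (worker, task) -> estimated accuracy.
    quota: dict mapping task -> number of workers it still needs.
    Illustrative sketch; the cited papers use more sophisticated schemes.
    """
    assignment, used = {}, {t: 0 for t in quota}
    for (worker, task), _a in sorted(accuracy.items(), key=lambda kv: -kv[1]):
        if worker not in assignment and used[task] < quota[task]:
            assignment[worker] = task
            used[task] += 1
    return assignment
```

Greedy assignment is a common baseline here: each step picks the remaining worker–task pair with the highest estimated accuracy, subject to each task's quota.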
“…2015); task assignment based on individual difference measures (Hettiachchi et al. 2020). Building on earlier work by Kahneman and colleagues, a number of studies have also addressed biases in crowdsourced judgments, for example, Eickhoff (2018) and Draws et al.…”
Section: How Can the AI/ML Community Deal With Noise? (mentioning, confidence: 99%)