Proceedings of the 2015 ACM SIGMOD International Conference on Management of Data 2015
DOI: 10.1145/2723372.2723731
Minimizing Efforts in Validating Crowd Answers

Abstract: In recent years, crowdsourcing has become essential in a wide range of Web applications. One of the biggest challenges of crowdsourcing is the quality of crowd answers, as workers have widely varying levels of expertise and the worker community may contain faulty workers. Although various techniques for quality control have been proposed, a post-processing phase in which crowd answers are validated is still required. Validation is typically conducted by experts, whose availability is limited and who incur high co…



Cited by 54 publications (34 citation statements)
References 44 publications
“…Interactive crowdsourcing applications [42] might require very fast response times. However, simple divide-and-conquer methods such as matrix partitioning [43] are not applicable, since a split-up of the answer matrix causes information loss on worker communities and item clusters.…”
Section: Scalable Model Inference and Prediction
confidence: 99%
“…Most answer aggregation algorithms operate in batch mode; hence, the aggregated answers would be recomputed from scratch every time a new worker answer arrives. There are only a few approaches to incremental answer aggregation, such as online EM [56], which targets incremental updates when a new answer arrives, and i-EM [43], [57], which targets incremental updates whenever the ground truth is extended. However, tailoring such incremental methods for our setting is non-trivial due to the dependency between labels and the community modelling of workers and items; e.g.…”
Section: Related Work
confidence: 99%
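The contrast the citing authors draw between batch and incremental aggregation can be illustrated with a minimal sketch: instead of recomputing the aggregate from all answers on every arrival, per-item vote counts are maintained so that each new worker answer is an O(1) update. This is a simplified majority-vote stand-in, not the EM-based methods of [43], [56], [57]; the class and method names are hypothetical.

```python
from collections import Counter, defaultdict

class IncrementalAggregator:
    """Maintain per-item vote counts so each new worker answer updates
    the aggregate incrementally, instead of recomputing from scratch."""

    def __init__(self):
        # item id -> Counter mapping each candidate label to its vote count
        self.votes = defaultdict(Counter)

    def add_answer(self, item, label):
        # O(1) incremental update when a new worker answer arrives.
        self.votes[item][label] += 1

    def aggregate(self, item):
        # Current majority-vote label for the item (ties broken arbitrarily).
        counts = self.votes[item]
        return counts.most_common(1)[0][0] if counts else None
```

The same pattern (cached sufficient statistics updated per answer) is what online EM generalises, with soft label posteriors in place of raw counts.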
“…These sorted sources are then combined following a greedy approach (lines 15-26), which gives preference to those sources with the highest estimated precision, and stops when the threshold is met (lines 17-22). If the threshold is not yet reached by adding the new sources to the subset of candidate sources S, we compute a new cut-off, to divide those sources that will be considered in a potential solution from those that will not (lines 16 and 23-25).…”
Section: Algorithm
confidence: 99%
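The greedy selection the citing authors describe can be sketched as follows: sources are sorted by estimated precision and added until a quality threshold is met. The combined-precision model below (1 minus the product of miss probabilities, assuming source independence) is a simplifying assumption of this sketch, not the citing paper's exact scoring; the function name is hypothetical.

```python
def select_sources(sources, threshold):
    """Greedily pick sources in descending order of estimated precision
    until the combined precision estimate reaches the threshold.

    sources: list of (name, precision) pairs, precision in [0, 1].
    Combined precision is modelled as 1 - prod(1 - p_i): the chance
    that at least one selected source is correct, under independence.
    """
    selected, miss_prob = [], 1.0
    for name, p in sorted(sources, key=lambda s: s[1], reverse=True):
        selected.append(name)
        miss_prob *= (1.0 - p)          # update probability all sources miss
        if 1.0 - miss_prob >= threshold:
            break                        # threshold met: stop greedily
    return selected, 1.0 - miss_prob
```

For example, with sources of precision 0.9, 0.8, and 0.5 and a threshold of 0.95, the two most precise sources suffice and the third is never considered.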
“…There has been significant recent interest in targeting the most effective feedback, in particular for crowdsourcing (e.g. [11,21,23,28,35]). …”
Section: Introduction
confidence: 99%