Handbook of Human Computation 2013
DOI: 10.1007/978-1-4614-8806-4_30
Risks and Rewards of Crowdsourcing Marketplaces

Abstract: Crowdsourcing has become an increasingly popular means of flexibly deploying large amounts of human computational power. The present chapter investigates the role of microtask labor marketplaces in managing human and hybrid human-machine computing. Labor marketplaces offer many advantages that, in combination, allow human intelligence to be allocated across projects rapidly and efficiently, and information to be transmitted effectively between market participants. Human computation also comes with a set of challenges …

Cited by 53 publications (46 citation statements); references 58 publications.
“…Several research studies have recommended using a HIT approval rate of 95% or greater (Berinsky, Huber, & Lenz, ; Chandler, Paolacci, & Mueller, ; Goodman et al, ; Paolacci, Chandler, & Ipeirotis, ; Peer et al, ). The HIT approval requirement of 1000 or more approved HITs was designed to have a meaningful base to determine the 95% HIT approval rate.…”
Mentioning (confidence: 99%)
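The screening thresholds quoted above (a HIT approval rate of 95% or greater and at least 1000 approved HITs) map directly onto Amazon Mechanical Turk's built-in worker qualifications. The following Python sketch, which is not taken from the chapter, shows one way such a pre-screen might be attached to a HIT using boto3; the task URL, reward, and timing values are illustrative placeholders, and the system qualification type IDs should be checked against the current MTurk API documentation.

# Hedged sketch: restricting a HIT to workers meeting the cited screening
# thresholds (>=95% approval rate, >=1000 approved HITs). Values other than
# the thresholds themselves are placeholders.
import boto3

mturk = boto3.client("mturk", region_name="us-east-1")

# MTurk system qualifications (IDs as documented by AWS; verify before use):
#   Worker_PercentAssignmentsApproved -> "000000000000000000L0"
#   Worker_NumberHITsApproved         -> "00000000000000000040"
qualification_requirements = [
    {
        "QualificationTypeId": "000000000000000000L0",  # percent of assignments approved
        "Comparator": "GreaterThanOrEqualTo",
        "IntegerValues": [95],
        "ActionsGuarded": "Accept",  # workers below the threshold cannot accept the HIT
    },
    {
        "QualificationTypeId": "00000000000000000040",  # number of approved HITs
        "Comparator": "GreaterThanOrEqualTo",
        "IntegerValues": [1000],
        "ActionsGuarded": "Accept",
    },
]

# ExternalQuestion pointing at a hypothetical task page hosted by the requester.
question_xml = """<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
  <ExternalURL>https://example.com/microtask</ExternalURL>
  <FrameHeight>600</FrameHeight>
</ExternalQuestion>"""

response = mturk.create_hit(
    Title="Example microtask",
    Description="Illustrative HIT restricted to experienced, high-approval workers.",
    Reward="0.10",
    MaxAssignments=3,
    AssignmentDurationInSeconds=600,
    LifetimeInSeconds=86400,
    QualificationRequirements=qualification_requirements,
    Question=question_xml,
)
print(response["HIT"]["HITId"])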
“…We distinguish the following sub-dimensions:
- Description: This is the description of the work the requester asks the crowd to perform; it includes the instructions of how to perform the task and possible context information. The clarity and details of the description influence the way workers perform the task and hence the quality of its output [Chandler et al 2013].
- User interface: This is the software user interface workers use to perform the task.…”
Section: Crowdsourcing Quality Model
Mentioning (confidence: 99%)
“…Since the microtask paradigm in large scale crowdsourcing involves monotonous sequences of repetitive tasks, fatigue buildup can pose a potential problem to the quality of submitted work over time [9]. Furthermore, workers have been noted to be "satisficers" who, as they gain familiarity with the task and its acceptance thresholds, strive to do the minimal work possible to achieve these thresholds [8,51].…”
Section: CSCW '17
Mentioning (confidence: 99%)