Proceedings of the 25th ACM Conference on Hypertext and Social Media 2014
DOI: 10.1145/2631775.2631819
A taxonomy of microtasks on the web

Abstract: Nowadays, a substantial number of people are turning to crowdsourcing in order to resolve tasks that require human intervention. Despite a considerable amount of research in the field of crowdsourcing, existing works fall short when it comes to classifying typically crowdsourced tasks. Understanding the dynamics of the tasks that are crowdsourced, and the behaviour of workers, plays a vital role in efficient task design. In this paper, we propose a two-level categorization scheme for tasks, based on an ex…


Cited by 102 publications (63 citation statements)
References 8 publications
“…To analyze the influence of different design considerations with respect to UI elements on worker performance, and their interplay with varying worker environments, we manually created a batch of 129 microtasks accounting for each of the 43 variations (each variation × 3 tasks), shown in the table on the companion webpage. These tasks consist of different types: information finding, verification and validation, interpretation and analysis, content creation, surveys, and content access [21]. The table on the companion page also presents sample tasks that we created corresponding to each of the UI element variations; these tasks are deliberately designed to reflect real-world microtasks that have previously been deployed on crowdsourcing platforms.…”
Section: Methodology and Task Design
confidence: 99%
“…A taxonomy of task types in microtask crowdsourcing platforms has been developed in [21], where a two-level structure with 6 categories at the top level has been proposed. In our work we leverage this top-level categorization to compare the effects of work environments on different types of tasks.…”
Section: Task Types in Microtask Crowdsourcing
confidence: 99%
“…We take several precautions in order to avoid introducing any bias due to poor task design. We refer the reader to our work for further details [4,5].…”
Section: Approach and Methodology
confidence: 99%
“…We analyze the responses from the trustworthy workers in order to propose a taxonomy of microtasks [4]. By studying the responses from the untrustworthy workers in contrast to trustworthy workers, we identify the behavioral patterns that workers exhibit in crowdsourced surveys [5].…”
Section: Approach and Methodology
confidence: 99%
“…In this two-level scheme, the authors classified microtasks, or work performed by contributors, into high-level categories and then divided each type into subcategories [6]. Parshotam provided a working definition of crowdsourcing by reviewing five distinct applications and demonstrating the differences between them.…”
Section: Introduction
confidence: 99%