Proceedings of the 38th International Conference on Software Engineering Companion 2016
DOI: 10.1145/2889160.2889225
Trustworthiness in enterprise crowdsourcing

Abstract: In this paper we study the trustworthiness of the crowd for crowdsourced software development. Through the study of literature from various domains, we present the risks that impact the trustworthiness in an enterprise context. We survey known techniques to mitigate these risks. We also analyze key metrics from multiple years of empirical data of actual crowdsourced software development tasks from two leading vendors. We present the metrics around untrustworthy behavior and the performance of certain mitigatio…

Cited by 20 publications
(19 citation statements)
References 26 publications
“…Future work includes additional methods to improve worker responses, such as contest-type tasks [13] and alternative worker compensations. We will explore the use of automated extraction tools to complement the crowdsourcing methodology.…”
Section: Discussion
confidence: 99%
“…Data regarding workforce reliability (in terms of complete task submission and thus participation), task registration speed, task completion duration, skills (based on a programming-language knowledge list) and challenge types (based on the nature and rewards of each contest) were analyzed and identified as catalyst factors for success. Similarly, Dwarakanath et al [8] assessed crowd trustworthiness based on submission quality, timeliness and ownership, concluding that task requirements, user efficacy and reputation strongly influence the trustworthiness of the crowd. Karim et al in [13] proposed a recommendation system that helps mainly crowd workers (and, as a side effect, providers) take on the appropriate tasks based on technology requirements and their skills.…”
Section: F. How To Crowdsource?
confidence: 99%
“…In the literature [25], Dubey A et al analyzed crowd developers' characteristics on Topcoder.com and upwork.com to predict the quality of tasks completed by them. Furthermore, they put forward the trustworthiness of software crowdsourcing to help enterprises avoid certain risks [26]. Alelyani T and Yang Y [27] discussed the reliability that different workers showed on different tasks, in order to identify certain behavioral characteristics of crowd developers.…”
Section: Research On Crowd Workers
confidence: 99%