2019
DOI: 10.25300/misq/2019/13649
Managing the Crowds: The Effect of Prize Guarantees and In-Process Feedback on Participation in Crowdsourcing Contests

Abstract: Crowdsourcing contests are contests in which organizations tap into the wisdom of crowds by outsourcing tasks to large groups of people on the Internet. In an online environment often characterized by anonymity and lack of trust, such contests carry inherent uncertainties for participants. This study focuses on crowdsourcing contests with winner-take-all prizes. During these contests, submissions are made sequentially, and contest hosts can provide public in-process feedback on submissions as soon as…

Cited by 70 publications (106 citation statements); references 22 publications.
“…The present research contributes to the extant literature in several ways. First, it advances the literature on the impacts of seeker involvement (e.g., the feedback, exemplars, and prizes that seekers provide) in crowdsourcing ideation contests (Jian et al. Forthcoming; Koh Forthcoming; Wooten and Ulrich 2017). Moreover, our focus on how seeker exemplars shape solver behaviors in different Search and Evaluate activities enriches the understanding of seekers' influences on solvers in specific ideation activities.…”
Section: Introduction (mentioning)
confidence: 76%
“…At the end of the contests, seekers choose the ideas that they want to acquire and award the contest prizes to the corresponding solvers. Prior research finds that the information seekers provide in their contests has strategic impacts on how and what solvers ideate (Jian et al. Forthcoming; Koh Forthcoming; Lee et al. 2018; Wooten and Ulrich 2017). For example, in many contests, seekers often show examples of solutions that they like (see Figure 1), and these solution exemplars can affect the ideas that solvers generate (Koh Forthcoming).…”
Section: Introduction (mentioning)
confidence: 99%
“…Wooten and Ulrich [25] compared the impact of no feedback, random feedback, and directional feedback on solver participation through a field experiment on a crowdsourcing contest platform and found that directional feedback helps improve the average quality of solutions. Jian et al. [26] found that the amount of feedback provided during the crowdsourcing process had a positive impact on the number of solutions submitted. Jiang et al. [27] empirically tested the impact of feedback on crowdsourcing contest outcomes and found that feedback positively affects solvers' participation.…”
Section: Literature Review (mentioning)
confidence: 99%
“…Amazon Mechanical Turk HITs) to much more complex problems, including new idea generation and R&D problems, which create new knowledge for the company [2]. Many crowdsourcing activities are organized as contests, and these contests can be hosted on crowdsourcing platforms (e.g., TopCoder and Kaggle) [3]. Crowdsourcing contests have emerged as an innovative way for companies to solve business problems and have given them access to the knowledge of the crowd external to the firm [4].…”
Section: Introduction (mentioning)
confidence: 99%
“…But none of these studies has examined how structural capital (the number of times team members have teamed up) affects individuals' sustained participation. Moreover, crowdsourcing platforms host contests from different organizations, which can affect individuals' participation behavior on these platforms [3]. Few studies have explored the effect of environment-specific factors on individuals' sustained participation, and fewer still have examined how organization-specific factors affect sustained participation in crowdsourcing contests.…”
Section: Introduction (mentioning)
confidence: 99%