A growing number of people now work as part of online crowd work. Crowd work is often characterized as low-wage work, yet we know little about the actual wage distribution in practice or about what drives low or high earnings in this setting. We recorded 2,676 workers performing 3.8 million tasks on Amazon Mechanical Turk. Our task-level analysis revealed that workers earned a median hourly wage of only ~$2/h, and that only 4% earned more than $7.25/h. While the average requester pays more than $11/h, lower-paying requesters post much more work. Our wage calculations are influenced by how unpaid work is accounted for, e.g., time spent searching for tasks, working on tasks that are rejected, and working on tasks that are ultimately not submitted. We further explore the characteristics of tasks and working patterns that yield higher hourly wages. Our analysis informs platform design and worker tools aimed at creating a more positive future for crowd work.
By lowering the costs of communication, the web promises to enable distributed collectives to act around shared issues. However, many collective action efforts never succeed: while the web's affordances make it easy to gather, these same decentralizing characteristics impede any focus towards action. In this paper, we study challenges to collective action efforts through the lens of online labor by engaging with Amazon Mechanical Turk workers. Through a year of ethnographic fieldwork, we sought to understand online workers' unique barriers to collective action. We then created Dynamo, a platform to support the Mechanical Turk community in forming publics around issues and then mobilizing. We found that collective action publics tread a precariously narrow path between the twin perils of stalling and friction, balancing with each step between losing momentum and flaring into acrimony. However, specially structured labor to maintain efforts' forward motion can help such publics take action.
Crowdworkers regularly support their work with scripts, browser extensions, and other software to enhance their productivity. Despite the evident significance of these tools, little is understood about how they affect crowdworkers' quality of life and work. In this study, we report findings from an interview study (N=21) exploring the tooling practices of full-time crowdworkers on Amazon Mechanical Turk. Our interview data suggest that the tooling used by crowdworkers (1) strongly contributes to the fragmentation of microwork by enabling task switching and multitasking; (2) promotes the fragmentation of crowdworkers' work-life boundaries by encouraging a 'work-anywhere' attitude; and (3) contributes to the fragmentation of social ties within worker communities because tooling access is limited. Our findings have implications for building systems that unify crowdworkers' work practice in support of their productivity and well-being.
Crowdsourcing platforms are increasingly being harnessed for creative work. The platforms' potential for creative work is well recognized, but workers' perspectives on such work have not been extensively documented. In this paper, we uncover what workers have to say about creative work on paid crowdsourcing platforms. Through a quantitative and qualitative analysis of a questionnaire launched on two different crowdsourcing platforms, we find clear differences between the workers on the two platforms in both their preferences and their prior experience with creative work. We identify common pitfalls of creative work on crowdsourcing platforms, provide recommendations for requesters of creative work, and discuss the meaning of our findings within the broader scope of creativity-oriented research. To the best of our knowledge, this is the first extensive worker-oriented study of creative work on paid crowdsourcing platforms.