Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2021
DOI: 10.18653/v1/2021.naacl-main.295
Beyond Fair Pay: Ethical Implications of NLP Crowdsourcing

Abstract: The use of crowdworkers in NLP research is growing rapidly, in tandem with the exponential increase in research production in machine learning and AI. Ethical discussion regarding the use of crowdworkers within the NLP research community is typically confined in scope to issues related to labor conditions such as fair pay. We draw attention to the lack of ethical considerations related to the various tasks performed by workers, including labeling, evaluation, and production. We find that the Final Rule, the co…

Cited by 40 publications (28 citation statements)
References 34 publications
“…There has been work on proposing guidelines for requesters (Sabou et al., 2014), incorporating workers into the IRB process (Libuše Hannah Vepřek, 2020), and developing tools to help workers address the power imbalance in the online workplace (Irani and Silberman, 2013, 2016). Concurrent with this work, another study showed that crowdsourcing is being used more each year in NLP research, and there is limited awareness of the ethical issues in this type of work (Shmueli et al., 2021).…”
Section: Background and Related Work
confidence: 86%
“…Research has also been conducted to investigate annotation bias and annotator pools (Al Kuwatly et al., 2020; Waseem, 2016; Ross et al., 2017; Shmueli et al., 2021; Posch et al., 2018), as well as bias (especially racial) in existing datasets (Davidson et al., 2019b; Laugier et al., 2021). It was found that data can reflect and propagate annotator bias.…”
Section: Related Work
confidence: 99%
“…(1) There is a lot of variation in the researcher perception of when ethics review is needed (Shmueli et al., 2021). Santy et al. (2021) showed that less than 0.8% of NLP studies published since 2006 have sought IRB approval, and that was mostly for data collection or annotation.…”
Section: Necessary For the Following Reasons
confidence: 99%