2021
DOI: 10.1609/hcomp.v9i1.18939
A Checklist to Combat Cognitive Biases in Crowdsourcing

Abstract: Recent research has demonstrated that cognitive biases such as the confirmation bias or the anchoring effect can negatively affect the quality of crowdsourced data. In practice, however, such biases go unnoticed unless specifically assessed or controlled for. Task requesters need to ensure that task workflow and design choices do not trigger workers’ cognitive biases. Moreover, to facilitate the reuse of crowdsourced data collections, practitioners can benefit from understanding whether and which cognitive bia…

Citation types: 2 supporting, 41 mentioning, 2 contrasting.
Cited by 54 publications (45 citation statements). References 60 publications.
“…While it is challenging to employ identical participant constraints across MTurk and Prolific due to the varying platform features and dynamics, and considering the susceptibility of surveys (Gadiraju et al 2015), we aimed to control quality strictly to ensure reliable responses. We used the Cognitive Bias Checklist introduced by Draws et al (2021) to inform our survey design and limit potential cognitive biases from seeping through.…”
Section: Methods (citation type: mentioning; confidence: 99%)
“…Crowd workers can perform as well as domain experts in certain tasks (Snow et al 2008), especially when the composition of the workers' pool is carefully curated in a task-dependent manner (Allen et al 2021). However, researchers have brought attention to the risk of annotators' cognitive bias (Eickhoff 2018; Draws et al 2021), stereotyping encoded in annotators' assessments (Otterbacher 2015), and uneven representations of demographic characteristics among annotators (Barbosa and Chen 2019). A number of factors, including task and instruction clarity (Wu and Quinn 2017), task design (Kazai et al 2011), incentives (Shah and Zhou 2015), and quality control mechanisms (Ipeirotis, Provost, and Wang 2010; McDonnell et al 2016), have been demonstrated to affect the quality of the annotations (Draws et al 2021).…”
Section: Related Work (citation type: mentioning; confidence: 99%)
“…However, researchers have brought attention to the risk of annotators' cognitive bias (Eickhoff 2018; Draws et al 2021), stereotyping encoded in annotators' assessments (Otterbacher 2015), and uneven representations of demographic characteristics among annotators (Barbosa and Chen 2019). A number of factors, including task and instruction clarity (Wu and Quinn 2017), task design (Kazai et al 2011), incentives (Shah and Zhou 2015), and quality control mechanisms (Ipeirotis, Provost, and Wang 2010; McDonnell et al 2016), have been demonstrated to affect the quality of the annotations (Draws et al 2021). Furthermore, even when labels are collected from domain experts, this does not mean they are free of bias.…”
Section: Related Work (citation type: mentioning; confidence: 99%)
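The "quality control mechanisms" referenced in the two statements above usually combine gold (test) questions with redundant labeling and aggregation. The snippet below is a minimal, hypothetical sketch of that idea, not the method of any cited paper: it drops workers whose accuracy on gold questions falls below a threshold, then majority-votes the surviving labels. All function and field names are illustrative.

```python
from collections import Counter

def filter_and_aggregate(labels, gold, min_accuracy=0.7):
    """Toy quality-control pass (illustrative only): drop workers who fail
    gold questions, then majority-vote the remaining labels per item.

    labels: dict of worker_id -> dict of item_id -> label
    gold:   dict of item_id -> known correct label (gold questions)
    """
    # Score each worker on the gold questions they answered.
    trusted = set()
    for worker, answers in labels.items():
        scored = [(item, lab) for item, lab in answers.items() if item in gold]
        if not scored:
            continue  # no gold overlap: treated as untrusted in this toy version
        accuracy = sum(lab == gold[item] for item, lab in scored) / len(scored)
        if accuracy >= min_accuracy:
            trusted.add(worker)

    # Majority vote over trusted workers' labels for non-gold items.
    votes = {}
    for worker in trusted:
        for item, lab in labels[worker].items():
            if item not in gold:
                votes.setdefault(item, Counter())[lab] += 1
    return {item: counter.most_common(1)[0][0] for item, counter in votes.items()}

if __name__ == "__main__":
    labels = {
        "w1": {"g1": "pos", "i1": "pos", "i2": "neg"},
        "w2": {"g1": "pos", "i1": "pos", "i2": "pos"},
        "w3": {"g1": "neg", "i1": "neg", "i2": "neg"},  # fails the gold check
    }
    gold = {"g1": "pos"}
    # i1 resolves to "pos"; the tie on i2 is broken arbitrarily by Counter.
    print(filter_and_aggregate(labels, gold))
```

Real deployments layer more on top of this (e.g., the EM-style worker-quality estimation of Ipeirotis, Provost, and Wang 2010), but the filter-then-aggregate shape is the common core.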
“…Their findings reveal that workers with strong opinions tend to produce biased annotations, and such bias should be mitigated to improve the quality of the data collected. In (Draws et al 2021), the authors propose a 12-item checklist adapted from business psychology to combat cognitive biases in crowdsourcing. We utilize this checklist to point out which of the 12 items flag potential biases in the data collected in our study.…”
Section: Potential Biases (citation type: mentioning; confidence: 99%)
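As the last statement illustrates, the checklist is applied by walking a task design through its 12 items and recording which ones it fails. One way such an audit could be operationalized is sketched below; this is a hypothetical illustration only, and the item texts are placeholders, not the actual checklist items published in Draws et al. (2021).

```python
from dataclasses import dataclass, field

@dataclass
class ChecklistItem:
    """One item of a cognitive-bias checklist audit.

    The `text` values used below are placeholders; the real 12 items
    are defined in Draws et al. (2021)."""
    item_id: int
    text: str
    satisfied: bool = False
    notes: str = ""

@dataclass
class TaskAudit:
    task_name: str
    items: list[ChecklistItem] = field(default_factory=list)

    def flagged(self):
        # Items left unsatisfied indicate potential cognitive biases
        # that the task design may trigger.
        return [i for i in self.items if not i.satisfied]

# Hypothetical usage: audit a relevance-judgment task against
# placeholder checklist items (NOT the published checklist text).
audit = TaskAudit(
    task_name="relevance judgments",
    items=[
        ChecklistItem(1, "Placeholder: instructions avoid suggesting an expected answer",
                      satisfied=True),
        ChecklistItem(2, "Placeholder: item order is randomized per worker",
                      satisfied=False,
                      notes="fixed order may induce anchoring-like carryover"),
    ],
)
for item in audit.flagged():
    print(f"Item {item.item_id} unmet: {item.text} ({item.notes})")
```

Recording notes alongside each unmet item mirrors how the citing studies use the checklist: not as a pass/fail gate, but as a structured way to disclose which biases may remain in a released dataset.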