2022 ACM Conference on Fairness, Accountability, and Transparency
DOI: 10.1145/3531146.3534629

The Effects of Crowd Worker Biases in Fact-Checking Tasks

Abstract: Due to the increasing amount of information shared online every day, the need for sound and reliable ways of distinguishing between trustworthy and non-trustworthy information is as present as ever. One technique for performing fact-checking at scale is to employ human intelligence in the form of crowd workers. Although earlier work has suggested that crowd workers can reliably identify misinformation, cognitive biases of crowd workers may reduce the quality of truthfulness judgments in this context. We perfor…
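To make the crowdsourced fact-checking setup concrete, here is a minimal Python sketch. It is an illustration, not the paper's method: it assumes a hypothetical 1-5 truthfulness scale and toy data, estimates each worker's systematic bias as their mean deviation from the per-item consensus, and re-aggregates bias-corrected ratings.

# Illustrative sketch (not the paper's method): aggregate crowd
# truthfulness judgments while correcting for each worker's systematic
# bias. All names and the toy data below are hypothetical.
from collections import defaultdict
from statistics import mean

# judgments[claim_id] -> list of (worker_id, rating on a 1-5 truthfulness scale)
judgments = {
    "claim_1": [("w1", 5), ("w2", 4), ("w3", 2)],
    "claim_2": [("w1", 4), ("w2", 3), ("w3", 1)],
}

# First pass: naive consensus per claim (simple mean of raw ratings).
consensus = {claim: mean(r for _, r in ratings)
             for claim, ratings in judgments.items()}

# Estimate each worker's bias as their average deviation from consensus.
deviations = defaultdict(list)
for claim, ratings in judgments.items():
    for worker, rating in ratings:
        deviations[worker].append(rating - consensus[claim])
bias = {worker: mean(ds) for worker, ds in deviations.items()}

# Second pass: re-aggregate using bias-corrected ratings.
debiased = {
    claim: mean(rating - bias[worker] for worker, rating in ratings)
    for claim, ratings in judgments.items()
}
print(debiased)

Note that a constant per-worker offset like this only captures systematic leniency or harshness; the cognitive biases the paper studies (e.g., judgments skewed by a worker's prior beliefs about a claim) generally call for task design or targeted debiasing rather than post-hoc adjustment.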

Cited by 9 publications (10 citation statements)
References 44 publications

Citation statements, ordered by relevance:

“…Our result indicates that crowd annotation in subjective tasks is affected only to a small extent by users' backgrounds (in our case, their news expertise), aligning with prior works [22,33,64]. Therefore, we can train users using their news expertise as a targeting criterion.…”
Section: RQ1: Annotation Performance
Mentioning confidence: 60%

“…Theme (n) and example response:
Perspectives and Biases (22): "They were taking different sides of the equation and putting forward different thought processes."
Information Placement or Depth (14): "The right article was less descriptive and focused more on the restrictions and not the case. It was definitely telling the story from one point of view."…”
Section: Theme (n)
Mentioning confidence: 99%

“…Building on earlier work by Kahneman and colleagues, a number of studies have also addressed biases in crowdsourced judgments, for example, Eickhoff (2018) and Draws et al. (2022). Based on proposals by Kahneman, Lovallo, and Sibony (2011) for reducing biases in individuals and small groups, Draws et al.…”
Section: How Can the AI/ML Community Deal With Noise?
Mentioning confidence: 99%

“…A number of studies have addressed psychological factors in crowdsourcing, such as effects of priming (Harrison et al. 2013; Morris, Dontcheva, and Gerber 2012); task design (Alagarai, Rajeshuni, and Indurkhya 2014); task breaks (Dai et al. 2015); and task assignment based on individual difference measures (Hettiachchi et al. 2020). Building on earlier work by Kahneman and colleagues, a number of studies have also addressed biases in crowdsourced judgments, for example, Eickhoff (2018) and Draws et al. (2022). Based on proposals by Kahneman, Lovallo, and Sibony (2011) for reducing biases in individuals and small groups, Draws et al. (2021) developed a checklist for reducing cognitive biases in crowdsourcing.…”
Section: Crowdsourcing, Human Computation, and Noise
Mentioning confidence: 99%

“…These organizations are often centralized, meaning that they are controlled by a small group of people. This can make it difficult for them to guarantee transparency and prevent manipulation, such as cherry-picking in favor of a particular party [5,15,54]. In addition, "fact-checking" is labor-intensive, and the curators may not be able to keep up with emerging news stories or cover the entire news ecosystem.…”
Section: Introduction
Mentioning confidence: 99%