Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume 2021
DOI: 10.18653/v1/2021.eacl-main.152
What Sounds “Right” to Me? Experiential Factors in the Perception of Political Ideology

Abstract: In this paper, we challenge the assumption that political ideology is inherently built into text by presenting an investigation into the impact of experiential factors on annotator perceptions of political ideology. We construct an annotated corpus of U.S. political discussion, where in addition to ideology labels for texts, annotators provide information about their political affiliation, exposure to political news, and familiarity with the source domain of discussion, Reddit. We investigate the variability i…

Cited by 6 publications (6 citation statements) · References 31 publications
“…Joseph et al. (2017) showed that how one constructs annotation tasks can significantly impact (supervised) model performance and one's assessment of it. Further, as demonstrated by Shen and Rose (2021) on the closely related task of inferring political ideology, annotator expertise and subjectivity also play an important role in the quality of annotated data.…”
Section: Stance Detection and Annotation
confidence: 99%
“…The present work complements these prior efforts by delving into other questions of annotator disagreement and inference. Whereas prior work has considered disagreement arising from task differences (Joseph et al., 2017) or from properties of the annotators (Shen and Rose, 2021), we control for both of these factors, taking a single task and a relatively homogeneous set of expert annotators. Instead, extending recent work studying prediction on multiple targets (van den Berg et al., 2019), we study how agreement varies depending on the target selected, and how, even within a single task design, annotators can come to rely on distinct subsets of information.…”
Section: Stance Detection and Annotation
confidence: 99%
“…However, the contextual information is local and does not take into account the global subject content under discussion. Shen and Rose (2021) investigate how annotators' political beliefs and familiarity with the source influence annotations of the political orientation of Reddit posts. Chung et al. (2019) present a dataset that collects hateful and toxic messages along with potential repairs: counter-narratives that use fact-based information and non-offensive language to de-escalate hateful discourse.…”
Section: Background and Related Work
confidence: 99%
“…Our findings from simulations provide directions for user experiments. Human perception, and thus human annotators' interpretation, is influenced by human factors such as preferences, cultural differences, bias, domain expertise, fatigue, time on task, or mood at annotation time (Alm, 2012; Amidei et al., 2020; Shen and Rose, 2021). Generally, even experts with long-standing practice or in-depth knowledge may not share consensus (Plank et al., 2014).…”
Section: Introduction
confidence: 99%