2019
DOI: 10.48550/arxiv.1912.11211
Preprint

Algorithmic Extremism: Examining YouTube's Rabbit Hole of Radicalization

Cited by 15 publications (27 citation statements) | References 0 publications
“…Meanwhile, their analysis of user comments suggests that users do migrate from milder to more extreme content over time. This is supported by Ledwich & Zaitsev [45], who find that YouTube's recommendation system actively discourages users from visiting extremist content. Their analysis suggests that YouTube directs traffic towards the two largest mainstream groups, the Partisan Right and the Partisan Left, and away from more niche content they labeled Conspiracy, White Identitarian, and Anti-Social Justice Warrior.…”
Section: YouTube Recommendation and Topic Drift (mentioning)
confidence: 85%
“…Overall, 14 studies implicated the YouTube recommender system in facilitating problematic content pathways (Alfano et al, 2020; AVAAZ Report, 2020; Chen et al, 2021; Hussein et al, 2020; Matamoros-Fernández, 2017; O’Callaghan et al, 2013; O’Callaghan et al, 2015; Papadamou et al, 2020; Papadamou et al, 2020; Röchert, Weitzel, & Ross, 2020; Schmitt et al, 2018; Song & Gruzd, 2017; Spinelli & Crovella, 2020). Two studies did not implicate the YouTube recommender system in facilitating problematic content pathways (Hosseinmardi et al, 2020; Ledwich & Zaitsev, 2019). Finally, seven studies produced mixed results (Abul-Fottouh, Song, & Gruzd, 2020; Faddoul, Chaslot, & Farid, 2020; Kaiser & Rauchfleisch, 2019; Kaushal et al, 2016; Papadamou et al, 2020; Ribeiro et al, 2020).…”
Section: Results (mentioning)
confidence: 99%
“…Those studies are thus described in other sub-sections. For example, Ledwich and Zaitsev (2019) included conspiratorial content as a proxy for radicalising content, so this study is discussed in the radicalising content section.…”
Section: Results (mentioning)
confidence: 99%
“…The selection of good models, we claim, however, benefits from inclusion of psychological grounding and extensive testing. […] content is more diverse than natural consumption [47], and that real systems do not exhibit the strong extremism or polarization effects implied by theoretical models [28,39,40,43,57]. The study [5] points out that recommendation systems might lead to undesired changes in preferences, and proposes to design for safe preference shifts, which are preference trajectories that are deemed "desirable".…”
Section: Related Work (mentioning)
confidence: 99%