2023
DOI: 10.1145/3610191

Hate Raids on Twitch: Understanding Real-Time Human-Bot Coordinated Attacks in Live Streaming Communities

Jie Cai, Sagnik Chowdhury, Hongyang Zhou, et al.

Abstract: Online harassment and content moderation have been well documented in online communities. However, new contexts and systems continually introduce new forms of harassment and require new moderation mechanisms. This study focuses on hate raids, a form of real-time group attack in live streaming communities. Through a qualitative analysis of hate raid discussions in the Twitch subreddit (r/Twitch), we found that (1) hate raids, as a human-bot coordinated group attack, leverage the live stream system to attack marginalized s…

Cited by 7 publications (1 citation statement)
References: 99 publications

“…Some preserve content in advance and appeal to the platform [4]. Additionally, marginalized communities are more susceptible to targeted harassment and abuse, and content moderation processes may disproportionately affect them [10,32]. Algorithmic moderation is often biased to the detriment of marginalized groups [6,29] and disproportionately produces false positives for marginalized groups [31], such as YouTube's demonetization of videos featuring LGBTQ+ issues [46].…”
Section: Content Moderation and Marginalization (citation type: mentioning)
Confidence: 99%