2016
DOI: 10.1080/1369118x.2016.1153700

Bot-based collective blocklists in Twitter: the counterpublic moderation of harassment in a networked public space

Abstract: This article introduces and discusses bot-based collective blocklists (or blockbots) in Twitter, which have been developed by volunteers to combat harassment in the social networking site. Blockbots support the curation of a shared blocklist of accounts, where subscribers to a blockbot will not receive any notifications or messages from those on the blocklist. Blockbots support counterpublic communities, helping people moderate their own experiences of a site. This article provides an introduction and overview…


Cited by 97 publications (52 citation statements)
References 30 publications
“…A key approach to managing the problem of online harassment is by developing moderation and blocking mechanisms (Crawford and Gillespie, 2016; Lampe and Resnick, 2004; Geiger, 2016). Our findings add nuance to our understanding of the challenges of this undertaking.…”
Section: Implications For Design
confidence: 77%
“…Online moderation largely addresses the problem of antisocial behavior, which occurs in the form of harassment [50], cyberbullying [47], and general aggression [23]. Approaches to moderating such content include decentralized, community-driven methods [14], as well as top-down methods relying on designated community managers or moderators [33]. Prior research in this area ranges from understanding the actors involved in antisocial behavior [6,28,36,42,51] to analyzing its effects [39] to tools for identifying such behavior [38,53], and even forecasting future instances [31,55].…”
Section: Further Related Work
confidence: 99%
“…There are further examples of this in the scholarly literature. Stuart Geiger details the use of the Twitter API to build automated ‘blocklists’ – when one member blocks an account, it is automatically blocked for all other members of the group [20]. This mass-blocking feature is one that Twitter has, to date, declined to implement.…”
Section: :14
confidence: 99%
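The citation statements above describe the core blockbot mechanism: one member's block decision propagates to every subscriber of a shared blocklist. A minimal, purely illustrative sketch of that data model follows; all class and method names here are hypothetical, and this is neither the Twitter API nor the actual code of any blockbot discussed in the article.

```python
# Illustrative model of a shared-blocklist ("blockbot") group.
# Hypothetical names throughout; not Twitter's API or any real blockbot.

class Blockbot:
    """Curates one shared blocklist; blocks apply to all subscribers."""

    def __init__(self):
        self.blocklist = set()    # accounts blocked group-wide
        self.subscribers = set()  # accounts subscribed to this blockbot

    def subscribe(self, account):
        self.subscribers.add(account)

    def block(self, reported_account):
        # One member's block is propagated to the whole group.
        self.blocklist.add(reported_account)

    def is_visible(self, sender, recipient):
        # A subscriber never sees messages from a blocklisted account;
        # non-subscribers are unaffected by the shared list.
        return not (recipient in self.subscribers and sender in self.blocklist)


bot = Blockbot()
bot.subscribe("alice")
bot.block("troll42")

print(bot.is_visible("troll42", "alice"))  # False: alice subscribes
print(bot.is_visible("troll42", "bob"))    # True: bob never subscribed
```

The design choice this sketch highlights is the one the article analyzes: moderation decisions are curated collectively and applied per-subscriber, rather than each user maintaining a private blocklist.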