2020
DOI: 10.1145/3415178

Reconsidering Self-Moderation

Abstract: Research in online content moderation has a long history of exploring different forms that moderation can take, including both user-driven moderation models on community-based platforms like Wikipedia, Facebook Groups, and Reddit, and centralized corporate moderation models on platforms like Twitter and Instagram. In this work I review different approaches to moderation research with the goal of providing a roadmap for researchers studying community self-moderation. I contrast community-based moderation resear…

Cited by 79 publications (27 citation statements)
References 75 publications
“…Online content moderation is one of the central processes through which public discourse is negotiated among different actors including users, social media platforms, and governmental regulators. In many content moderation systems, the most common and feasible way for ordinary internet users to moderate content is online reporting (Buni and Chemaly, 2016; Seering, 2020).…”
Section: Online Reporting and Weaponized Content Moderation (mentioning)
confidence: 99%
“…For example, the simple task of determining whether a contribution should be rejected could be performed by a human reviewing each contribution manually or by a system applying a predefined list of prohibited terms automatically. Recent advancements have helped to broaden the scope of content moderation from a platform-centered and regulatory perspective [40] to a perspective that centers on community-driven decision-making [69] within an increasingly complex political landscape [37,38].…”
Section: Moderators Advance Social Norms By Making Decisions On Content (mentioning)
confidence: 99%
“…A community might organize moderators into various configurations (e.g., ad hoc juries [26], teams [22,70]). Additionally, moderators are situated within the social organization of their community [69] and platform [34,68].…”
Section: Content Moderation Decisions Are Made By People Teams and Te... (mentioning)
confidence: 99%
“…As such, they are a first point of intervention for preventing and countering several criminal, deviant, or otherwise harmful activities online, in the effort to create, nurture or maintain a better social media platform ecosystem. Following Seering's (2020) distinction among ‘the platforms and policies perspective’ and ‘the communities perspective’ in moderation research, acknowledging that previous research has already stressed how platforms generally allow users significant leeway to self-moderate (for a review, see Seering, 2020), and fully recognising the importance of intra-group moderation, this study departs by looking at platforms and policies as – we claim – these provide the overarching frame within which also users’ intra-group moderation occurs.…”
Section: Introduction (mentioning)
confidence: 99%