Crossmod: A Cross-Community Learning-based System to Assist Reddit Moderators
2019
DOI: 10.1145/3359276

Abstract: In this paper, we introduce a novel sociotechnical moderation system for Reddit called Crossmod. Through formative interviews with 11 active moderators from 10 different subreddits, we learned about the limitations of currently available automated tools, and how a new system could extend their capabilities. Developed out of these interviews, Crossmod makes its decisions based on cross-community learning---an approach that leverages a large corpus of previous moderator decisions via an ensemble of classifiers. …
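As a rough illustration of the cross-community learning idea the abstract describes, the sketch below trains one classifier per source subreddit on that community's past removal decisions and scores a new comment by ensemble agreement. This is a minimal sketch, not Crossmod's actual implementation: the function names, scikit-learn models, and data layout are assumptions introduced here for clarity.

```python
# Hypothetical sketch of cross-community learning: one classifier per
# source subreddit, each fit on that community's prior moderator
# decisions; the ensemble's agreement is the moderation signal.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def train_ensemble(corpora):
    """corpora: {subreddit: (comments, removed_flags)} of past decisions."""
    ensemble = {}
    for subreddit, (comments, removed) in corpora.items():
        clf = make_pipeline(TfidfVectorizer(min_df=2),
                            LogisticRegression(max_iter=1000))
        clf.fit(comments, removed)  # removed: 1 if moderators deleted it
        ensemble[subreddit] = clf
    return ensemble

def agreement_score(ensemble, comment):
    """Fraction of source communities whose classifier would remove it."""
    votes = [clf.predict([comment])[0] for clf in ensemble.values()]
    return sum(votes) / len(votes)
```

A comment that, say, 9 of 10 source-community classifiers would have removed gets a high score even though no explicit rule was ever written down, which is the sense in which the decisions come from prior moderator behavior rather than hand-authored rules.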

Cited by 107 publications (74 citation statements)
References 35 publications
“…In many cases, these automation tools are not, or do not start out, embedded in a platform but grow separately as one-off bespoke pieces of software [22], making them more difficult to author and manage. Additional tools exist that help mods create better policies [8,52], but these are still focused on the role of mods. Finally, some tools exist that are aimed at regular members of a community [23,32,46], but these are primarily used to enact policies for an individual rather than a group.…”
Section: Background and Motivation
confidence: 99%
“…As of early 2016, none of the 15 major platforms they analyzed provided a specific definition for harassment in any of the 56 documents they collected, and only Twitter and Instagram provided descriptions of behaviors that were considered when determining whether actions would be defined as harassment [11]. In complementary work, West studied users' reactions to content removal and folk theories about how those systems work, noting that users are often left to speculate about reasons for removal due to a general lack of transparency in moderation actions [103]. Suzor et al. proposed specific ways in which increased transparency could help educate users and establish a sense of trust in these processes [97].…”
Section: Structural and Functional Perspectives
confidence: 99%
“…Chandrasekharan et al. avoid defining what behaviors are "problematic" by using external communities' definitions of problematic behavior to define rules for a new community. This results in a classifier that works not from consciously defined rules but from the aggregation of prior users' moderation decisions [11]. Chandrasekharan et al. take an important step by choosing to use this classifier in a way that brings comments to the attention of moderators rather than removing the comments without human oversight.…”
Section: Social and Technical
confidence: 99%
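The design choice this passage highlights, surfacing flagged comments to moderators instead of deleting them automatically, can be made concrete with a small triage step on top of an ensemble score. The threshold, queue, and function names below are illustrative assumptions, not the paper's actual interface.

```python
# Hypothetical human-in-the-loop triage: high-agreement comments are
# queued for moderator review rather than removed automatically.
from collections import deque

review_queue = deque()  # moderators drain this, e.g. via a mod-tools UI

def triage(score_fn, comment, flag_threshold=0.75):
    """score_fn: callable returning the ensemble's removal agreement in [0, 1]."""
    score = score_fn(comment)
    if score >= flag_threshold:
        # Report to humans with the evidence; the final call stays with them.
        review_queue.append({"comment": comment, "score": score})
    return score
```

Keeping the removal decision with moderators preserves human oversight while still letting the ensemble do the expensive work of finding candidates.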