2020
DOI: 10.1177/1461444820912540

Content moderation: Social media’s sexist assemblages

Abstract: This article proposes ‘sexist assemblages’ as a way of understanding how the human and mechanical elements that make up social media content moderation assemble to perpetuate normative gender roles, particularly white femininities, and to police content related to women and their bodies. It investigates sexist assemblages through three of many potential elements: (1) the normatively gendered content presented to users through in-platform keyword and hashtag searches; (2) social media platforms’ community guide…

Cited by 74 publications (38 citation statements)
References 52 publications
“…Assemblages, drawing from Deleuze and Guattari [21] and Bucher [10], refers to social media platforms' dynamic and complex content moderation processes which bring together multiple elements, including both algorithmic and human techniques, to impose policy. Gerrard and Thornham [41] posited that similar silencing likely also occurs for other marginalized social media users, and our work provides evidence for this claim and describes how trans and Black people experience such anti-trans and racist assemblages. While conservatives also experience silencing and perceive anti-conservative bias on social media sites, there is a vast difference between silencing conservatives' misinformation and hate speech and silencing trans and Black users' personal identity-related content.…”
Section: Discussion (supporting)
confidence: 61%
“…Our results provide empirical evidence for some of the ways social media content moderation silences marginalized groups like Black and trans people. Gerrard and Thornham [41] described social media platforms' prescriptive power in a feminist context by introducing the concept of 'sexist assemblages,' which describes how social media content moderation "perpetuate[s] normative gender roles, particularly white femininities, and police[s] content related to women and their bodies." Assemblages, drawing from Deleuze and Guattari [21] and Bucher [10], refers to social media platforms' dynamic and complex content moderation processes which bring together multiple elements, including both algorithmic and human techniques, to impose policy.…”
Section: Discussion (mentioning)
confidence: 99%
“…While we acknowledge there are benefits to content moderation in some scenarios, such as removing hate speech, there are also concerns that such practices present. For example, content moderation has been shown to perpetuate systems of sexism (Gerrard & Thornham, 2020). Such biases are also prevalent when these systems are automated, as these systems perpetuate the viewpoints of their trainers, which represent a nonrepresentative subset of viewpoints on the contested norms of what is appropriate content for the platform (Binns et al., 2017; Gillespie, 2020).…”
Section: Ethical Considerations (mentioning)
confidence: 99%
“…It is also important to continually highlight that while this is happening in the media and communication space, particularly in the social media space, it is representative of how policymaking should be undertaken more broadly. It is a continuing discussion between the users (Hoffmann et al., 2018), the technologies (Hutchinson, 2021), the socio-cultures (Cabalquinto & Soriano, 2020), the moderators (Gerrard & Thornham, 2020), the researchers (Gillespie et al., 2020) and the policymakers that enables good policy design and implementation.…”
Section: The Stretch of Platform-oriented Regulation (mentioning)
confidence: 99%