Social Media and Democracy 2020
DOI: 10.1017/9781108890960.011

Facts and Where to Find Them: Empirical Research on Internet Platforms and Content Moderation

Abstract: We live in an era of increasing worry that internet platforms like Facebook or Twitter, which mediate our online speech, are also fomenting hatred, spreading misinformation, and distorting political outcomes. The 2016 US presidential election, in particular, unleashed a torrent of concern about platform-borne harms. Policymakers around the world have called for laws requiring platforms to do more to combat illegal and even merely "harmful" content. From the perspective of platforms themselves, these proposals h…

Cited by 17 publications (12 citation statements)
References 522 publications
“…In fact, as private companies, they are free to enforce any internal policy they choose [53]. Interviews with insiders and employees of these companies [40,42] show that these policies are determined via "internal debates, appeals by practitioners, and outside pressure" and are applied in opaque ways. Unfortunately, the Library does not reveal the advertisements that have been rejected by Facebook.…”
Section: Discussion (mentioning)
confidence: 99%
“…However, research has shown that a consequence of stringent filtering systems and censorship is that those who have been deplatformed from mainstream sites may migrate to private chats and encrypted sites like Telegram (Bloom et al., 2019), which, although often smaller and offering extremists a reduced audience, are perceived by users as more secretive through their encoding, making it far more difficult for public monitoring of misuse and emerging threats. Concern has also arisen with respect to the possibility of subjective, disparate and potentially biased enforcement of content and creators by different platforms resulting from the absence of internationally agreed upon definitions of what constitutes terrorism and violent extremism (Keller & Leerssen, 2020). Moreover, transparency regarding the tools and approaches used by social media companies for content moderation is highly limited and the absence of details regarding their implementation makes evaluating their effectiveness difficult (United Nations, 2021).…”
Section: Literary Overview (mentioning)
confidence: 99%
“…For the most part, moderators review content only after it has been flagged and brought to their attention, either algorithmically or by users themselves (Klonick, 2018). Yet, mounting pressure from governments and public groups in recent years has nudged platforms to adopt a more proactive approach to the detection and removal of content that violates their own private rules or public laws (Keller et al., 2020; Gillespie, 2020).…”
Section: Content Moderation As a Political Issue (mentioning)
confidence: 99%