2019
DOI: 10.14763/2019.2.1407
The platform governance triangle: conceptualising the informal regulation of online content

Abstract: From the new Facebook 'Oversight Body' for content moderation to the 'Christchurch Call to eliminate terrorism and violent extremism online,' a growing number of voluntary and non-binding informal governance initiatives have recently been proposed as attractive ways to rein in Facebook, Google, and other platform companies hosting user-generated content. Drawing on the literature on transnational corporate governance, this article reviews a number of informal arrangements governing online content on platforms i…

Cited by 122 publications (63 citation statements)
References 35 publications
“…Since the 2016 US election, there has been a substantial increase in public attention paid to content moderation issues, now widely seen as a crucial element of major tech and platform policy debates, as well as broader academic awareness of the problems with the platform governance status quo (Gorwa, 2019b). A growing body of scholarship has documented the multiple challenges of commercial content moderation as enacted by platforms today, ranging from labour concerns (about the taxing working conditions and mental health challenges faced by moderators, many of whom are outsourced contractors in the Global South), to democratic legitimacy concerns (about global speech rules being set by a relatively homogenous group of Silicon Valley elites), and process concerns about the overall lack of transparency and accountability (see Gillespie, 2018; Kaye, 2019; Roberts, 2018; Suzor et al., 2019).…”
Section: Turning to AI for Moderation at Scale
confidence: 99%
“…Content moderation has long been a famously opaque and secretive process (Gillespie, 2018; Roberts, 2019; Suzor, 2019). Years of pressure from researchers, journalists, and activists have recently led to notable transparency efforts by companies like Facebook, such as the long-overdue publication of the 'Community Standards' that outline the bounds of acceptable behaviour on the site, the introduction of a formal appeals process, and an effort to create some kind of independent oversight mechanism for their policies (Kadri and Klonick, 2019).…”
Section: Decisional Transparency
confidence: 99%
“…This points towards the larger societal complications caused by the structural change in political communication. Big data analytics and algorithmic governance have increasingly become a new form of governance in societies (Gorwa 2019a, 2019b), opening up new spaces of knowledge and control that did not previously exist, the implications of which are only now becoming evident (Amoore and Piotukh 2015). While bots, trolls and political advertisements may be short-term problems solvable by digital platforms in cooperation with governments globally, a far more fundamental challenge is the changing power dynamics between the two (Helberger, Pierson, and Poell 2018).…”
Section: Background, Definitions, Literature
confidence: 99%
“…Academia and civil society are also increasingly represented in the voluntary self-regulatory organs devised by major social media platforms, from the long-standing Global Network Initiative, which conducts audits of platform content moderation practices, to Facebook's novel and widely publicized Oversight Board. More open-ended calls to enable independent research can be found in the EU Code of Practice on Disinformation. Its signatories have committed to "empower the research community", which includes "sharing privacy protected datasets, undertaking joint research, or otherwise partnering with academics and civil society organizations if relevant and possible"; and to "convene an annual event to foster discussions within academia, the fact-checking community and members of the value chain".…”
Section: Research Partnerships with Academia and Civil Society
confidence: 99%