This paper discusses the new phenomenon of platform ad archives. Over the past year, leading social media platforms have introduced publicly accessible databases documenting their political advertisements, and several countries have moved to regulate them. If designed and implemented properly, ad archives can correct for structural informational asymmetries in the online advertising industry, and thereby improve accountability through litigation and through publicity. However, present implementations leave much to be desired. We discuss key criticisms, suggest several improvements, and identify areas for future research and debate.
We live in an era of increasing worry that internet platforms like Facebook or Twitter, which mediate our online speech, are also fomenting hatred, spreading misinformation, and distorting political outcomes. The 2016 US presidential election, in particular, unleashed a torrent of concern about platform-borne harms. Policymakers around the world have called for laws requiring platforms to do more to combat illegal and even merely "harmful" content.

From the perspective of platforms themselves, these proposals have a lot in common. Regardless of their substantive mandates (to address content that is misleading, hateful, or violent, for example), they all require similar operational processes to comply. Platforms already have these processes in place to enforce current laws and their discretionary Community Guidelines. Any new efforts to regulate content online will likely build on existing systems, personnel, and tools, and will inherit both their strengths and their weaknesses. That makes it important to understand those systems.

Reliable information about platforms' content-removal systems was, for many years, hard to come by; but data and disclosures are steadily emerging as researchers focus on the topic and platforms ramp up their transparency efforts. This chapter reviews the current and likely future sources of information.

Some content takedowns are required by law, while others are performed voluntarily. Legal takedowns are shaped by intermediary liability laws, which tell platforms what responsibility they have for unlawful content posted by users.
Social media recommender systems play a central role in determining what content is seen online, and what remains hidden. As a point of control for media governance, they are subject to intense controversy and, increasingly, regulation by European policymakers. A recurring theme in such efforts is transparency, but this is an ambiguous concept that can be implemented in various ways depending on the types of accountability one envisages. This paper maps and critiques the various efforts at regulating social media recommendation transparency in Europe, and the types of accountability they pursue. It identifies three categories of disclosure rules in recent policymaking: (1) user-facing disclaimers, (2) government auditing and (3) data-sharing partnerships with academia and civil society. Despite their limitations and pitfalls, it is argued, each of these approaches has a potential added value for media governance as part of a tiered, variegated landscape of transparency rules. However, an important element is missing: public disclosures. Given the deeply political and value-laden context of media governance, it is argued, this field cannot rely exclusively on the technocratic, institutionalized forms of transparency emphasized in current proposals. The final section articulates the distinct benefits of public disclosures as a supplement to existing transparency measures, and suggests starting points for their design and regulation.