This article describes domestic violence as a key context of online misogyny, foregrounding the role of digital media in mediating, coordinating, and regulating it, and proposing an agenda for future research. Scholars and anti-violence advocates have documented the ways digital media exacerbate existing patterns of gendered violence and introduce new modes of abuse, a trend highlighted by this special issue. We propose the term "technology facilitated coercive control" (TFCC) to encompass the technological and relational aspects of patterns of abuse against intimate partners. Our definition of TFCC is grounded in the understanding of domestic violence (DV) as coercive, controlling, and profoundly contextualised in relationship dynamics, cultural norms, and structural inequality. We situate TFCC within the multiple affordances and modes of governance of digital media platforms for amplifying and ameliorating abuse. In addition to investigating TFCC, scholars are beginning to document the ways platforms can engender counter-misogynistic discourse and act as powerful agents of positive change via the regulation and governance of online abuse. Accordingly, we propose four key directions for a TFCC research agenda that recognises and asks new questions about the role of digital media platforms as both facilitators of abuse and potential partners in TFCC prevention and intervention.
Lawless is about the power that technology companies have over our lives and how we can develop a new constitutionalism to better protect our rights. Social media platforms, search engines, and other technology companies influence what we can see and say online. These giant companies govern our behavior online without real accountability, and they are at the centre of fierce battles between governments, lobby groups, the media, and grassroots campaigns from activists. Drawing on ten years of research, this book shows how our social lives, our news, and our information environments are shaped by a complex web of legal, technical, and social forces. This is a book about the future of our media and our shared social spaces. We are now at a constitutional moment—a time when we can all demand better from the companies that govern our lives. This book provides a guide to a new constitutionalism: real limits on power that protect human rights in a decentralized environment. Ultimately, it provides a comprehensive argument about how we should expect the governance of online social spaces to be more legitimate – and particularly, how we might develop new forms of due process for the algorithmic and human decision-making systems that rule our digital lives.
This article identifies the current global ‘techlash’ towards the major digital and social media platforms as providing the context for a renewed debate about whether these digital platform companies are effectively media companies (publishers and broadcasters of media content), and the implications this has for twenty-first-century media policy. It identifies content moderation as a critical site around which such debates are being played out, and considers the challenges arising as national and regionally based regulatory options are considered for digital platforms that are ‘born global’. Finally, it considers the shifting balance between the ‘social contract’ of public interest obligations and democratic rights of free speech and freedom of expression.
Leaked documents, press coverage, and user protests have increasingly drawn attention to social media platforms’ seemingly contradictory governance practices. We investigate the governance approaches of Tinder, Instagram, and Vine through detailed analyses of each platform, using the ‘walkthrough method’ (Light, Burgess, and Duguay, 2016, ‘The walkthrough method: An approach to the study of apps’, New Media & Society 20(3)), as well as interviews with their queer female users. Across these three platforms, we identify a common approach we call ‘patchwork platform governance’: one that relies on formal policies and content moderation mechanisms but pays little attention to dominant platform technocultures (including both developer cultures and cultures of use) and their sustaining architectures. Our analysis of these platforms and reported user experiences shows that formal governance measures like Terms of Service and flagging mechanisms did not protect users from harassment, discrimination, and censorship. Key components of the platforms’ architectures, including cross-platform connectivity, hashtag filtering, and algorithmic recommendation systems, reinforced these technocultures. This significantly limited queer women’s ability to participate and be visible on these platforms, as they often self-censored to avoid harassment, reduced the scope of their activities, or left the platform altogether. Based on these findings, we argue that there is a need for platforms to take more systematic approaches to governance that comprehensively consider the role of a platform’s architecture in shaping and sustaining dominant technocultures.
Platforms govern users, and the way that platforms govern matters. In this article, I propose that the legitimacy of governance of users by platforms should be evaluated against the values of the rule of law. In particular, I suggest that we should care deeply about the extent to which private governance is consensual, transparent, equally applied and relatively stable, and fairly enforced. These are the core values of good governance, but are alien to the systems of contract law that currently underpin relationships between platforms and their users. Through an analysis of the contractual Terms of Service of 14 major social media platforms, I show how these values can be applied to evaluate governance, and how poorly platforms perform on these criteria. I argue that the values of the rule of law provide a language to name and work through contested concerns about the relationship between platforms and their users. This is an increasingly urgent task. Finding a way to apply these values to articulate a set of desirable restraints on the exercise of power in the digital age is the key challenge and opportunity of the project of digital constitutionalism.