This article proposes ‘sexist assemblages’ as a way of understanding how the human and mechanical elements that make up social media content moderation assemble to perpetuate normative gender roles, particularly white femininities, and to police content related to women and their bodies. It investigates sexist assemblages through three of many potential elements: (1) the normatively gendered content presented to users through in-platform keyword and hashtag searches; (2) social media platforms’ community guidelines, which lay out platforms’ codes of conduct and reveal biases and subjectivities; and (3) the over-simplification of gender identities that is necessary to algorithmically recommend content to users as they move through platforms. By the time the reader finds this article, the elements of the assemblages we identify might have shifted, but we hope the framework remains useful for those aiming to understand the relationship between content moderation and long-standing forms of inequality.
This article draws on work from a 6-month project with 12 young mothers in which we mapped and tracked ourselves and our infants. The project employed a range of methods, including digital ethnographies, walk-along methods, hacking and playful experimentation. We explored, broke and tested a range of wearables and phone-based tracking apps, meeting regularly to discuss and compare our experiences and to interrogate the sociotechnical systems of postnatal healthcare alongside the particular politics of certain apps and their connective affordances. In this article, I use the project as a springboard to explore what I call algorithmic vulnerabilities: the ways that the contemporary datalogical anthropocene is exposing and positioning subjects in ways that not only rarely match their own lived senses of identity but are also increasingly difficult to interrupt or disrupt. While this is not necessarily a new phenomenon (see Clough et al., 2015; Hayles, 2017), I argue that the particular algorithmic vulnerabilities within this context are forged in part through the ideological enmeshing of the long-running atomization of maternal and infant bodies within healthcare systems (Crowe, 1987; Shaw, 2012; Wajcman, 1991) with new and emergent tracking apps (Greenfield, 2016; Lupton, 2016; O’Riordan, 2017), and that they create momentary stabilizations of sociotechnical systems in which maternal subjectivity and female embodiment become algorithmically vulnerable in affective and profound ways. These stabilizations become increasingly and problematically normative, in part because they feed and perpetuate a wider ‘taken-for-granted’ sensibility of gendered neoliberalism (Gill, 2017: 609) which, I argue, is coming to encapsulate the contemporary datalogical anthropocene. Further, the sociotechnical politics of the apps and the healthcare systems are revealed as co-dependent, raising a number of questions about long-term algorithmic vulnerabilities and normativities that predate the contemporary datalogical ‘turn’ and impact both practices and methods.