“…Researchers have so far focussed on the human labour behind content moderation (Roberts, 2016, 2017b, 2019; Carmi, 2019), social media platforms' changing responsibilities (Gillespie, 2015, 2018; Suzor, 2019), users' experiences of platforms' interventions (Duguay et al., 2018; Gerrard, 2018; Myers-West, 2018), and community-driven forms of moderation (Lo, 2018; Seering et al., 2019; Squirrell, 2019). Uniting this research is a focus on humans and machines, partly through the legacy of Science and Technology Studies (STS) scholarship (for example Wajcman, 1991; Suchman, 2007; Barad, 2009), but also because of the increasing need to understand how social norms 'leak across', to use Cheney-Lippold's term (2017: 143), to content moderation processes and vice versa.…”