2019
DOI: 10.5204/lthj.v1i0.1386

Virginia Eubanks (2018) Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. New York: Picador, St Martin’s Press

Abstract: Law, Technology and Humans book review editor Dr Faith Gordon reviews Virginia Eubanks (2018) Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor.

Cited by 24 publications (2 citation statements). References: 0 publications.

“…This is not to belittle the inequalities that algorithms create for people's lives, nor do we wish to downplay the effects of automation and automated decision-making on major domains of social life such as welfare, policing, border management, etc. (Eubanks 2018; Benjamin 2019; Kaufman and Leese 2018). Rather, we wish to complement prevailing critiques by emphasising the ways in which the 'rule making' (Katzenbach and Ulbricht 2019, 2) principle of algorithms is influenced by different operational, legislative, and sociocultural contexts.…”
Section: The Emerging Field of Critical Algorithm Studies
confidence: 99%
“…Psychobiographies of journalists (Kõuts-Klemm, 2019), designers of algorithms (Svensson, forthcoming) and editors and media makers (Diakopoulos, 2019) are known to orient their data practices. A literature is emerging that documents the biases inscribed in algorithms (Noble, 2018), for instance, as designed by and for white people or men (Eubanks, 2017). There is also a literature showing how media actors may lack the skills to interpret user data.…”
Section: Vignette 4: Participating on Goodreads
confidence: 99%