2019
DOI: 10.1002/bsl.2406

The sexist algorithm

Abstract: Algorithmic risk assessment tools are informed by scientific research concerning which factors are predictive of recidivism and thus support the evidence‐based practice movement in criminal justice. Automated assessments of individualized risk (low, medium, high) permit officials to make more effective management decisions. Computer‐generated algorithms appear to be objective and neutral. But are these algorithms actually fair? The focus herein is on gender equity. Studies confirm that women typically have far…

Cited by 42 publications (23 citation statements). References 25 publications.
“…COMPAS, an algorithm widely used in the US to guide sentencing by predicting the likelihood that a defendant will reoffend, turned out to be racially biased against black defendants, according to ProPublica, because they were more likely than white defendants to be incorrectly classified as high risk ("false positives") (Angwin et al., 2016). More recently, COMPAS has also been depicted as a "sexist algorithm" because its outcomes appear to systematically overclassify women into higher-risk groups (Hamilton, 2019). Similarly, PredPol, an algorithm designed to predict when and where crimes will take place and already in use in several US states, was found in 2016, after an analysis by the Human Rights Data Analysis Group, to result in police unfairly targeting certain neighborhoods.…”
Section: Ethical and Legal Challenges
Classification: mentioning, confidence: 99%
“…From a population or public health perspective, if the validity of the outputs differs systematically across subpopulations, then the use of the model to guide decisions in practice can exacerbate health inequalities. For example, in a justice context, a model predicting recidivism to guide decisions about granting parole could increase sex-based inequalities if it systematically overpredicts recidivism in women [36]. In a healthcare context, a model that systematically underpredicts the resources needed by black patients could increase racial inequalities if it is used to direct proportionally more resources to white patients [37].…”
Section: Prediction and Equity
Classification: mentioning, confidence: 99%
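Both excerpts turn on the same quantitative point: a risk tool can look accurate overall while its error rates differ by group. The short sketch below (not from the article; the data, group labels, and function name are hypothetical) shows one way such a subgroup audit is commonly done, computing the false positive rate, i.e. the share of people who did not reoffend but were classified as high risk, separately for each group. A gap between groups on this statistic is the kind of disparity the ProPublica analysis and the "sexist algorithm" critique describe.

# Minimal sketch, assuming hypothetical records of (group, reoffended, predicted high risk).
from collections import defaultdict

records = [
    ("women", False, True),   # did not reoffend, classified high risk -> false positive
    ("women", False, False),  # did not reoffend, classified low risk  -> true negative
    ("women", True,  True),   # reoffended; not counted in the false positive rate
    ("men",   False, False),
    ("men",   True,  True),
    ("men",   False, False),
]

def false_positive_rate_by_group(rows):
    """False positive rate per group: FP / (FP + TN) among people who did not reoffend."""
    fp = defaultdict(int)  # non-reoffenders classified high risk
    tn = defaultdict(int)  # non-reoffenders classified low risk
    for group, reoffended, predicted_high in rows:
        if not reoffended:
            if predicted_high:
                fp[group] += 1
            else:
                tn[group] += 1
    return {g: fp[g] / (fp[g] + tn[g])
            for g in set(fp) | set(tn) if fp[g] + tn[g] > 0}

print(false_positive_rate_by_group(records))
# With this toy data: {'women': 0.5, 'men': 0.0} -- a gap of this kind is what
# "overclassifying women in higher-risk groups" refers to.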
“…The other articles in this Special Issue focus on particular uses of statistics in the criminal justice setting. Melissa Hamilton's article, “The Sexist Algorithm”, looks at one of the more recent manifestations of statistical analysis – the algorithm (Hamilton, pp. 145–157).…”
Section: This Special Issue
Classification: mentioning, confidence: 99%