2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC)
DOI: 10.1109/smc.2018.00421
A Penalized Likelihood Method for Balancing Accuracy and Fairness in Predictive Policing

Abstract: Racial bias in predictive policing algorithms has been the focus of recent research, and, in the case of Hawkes processes, feedback loops are possible in which biased arrests are amplified through self-excitation, leading to hotspot formation and further arrests of minority populations. In this article we develop a penalized likelihood approach for introducing demographic parity into point process models of crime. In particular, we add a penalty term to the likelihood function that encourages the amount of police …
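The penalized-likelihood idea described in the abstract (a standard point-process likelihood plus a term that encourages demographic parity in predicted policing) can be illustrated with a toy Poisson intensity model. Everything below is a minimal sketch under stated assumptions: the synthetic data, the exponential-linear intensity, the specific parity penalty, and all variable names are illustrative, not the paper's actual formulation.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Synthetic grid: 200 cells, 2 covariates, and a binary demographic
# group label per cell (all assumptions for illustration).
n, d = 200, 2
X = rng.normal(size=(n, d))
group = rng.integers(0, 2, size=n)        # demographic group of each cell
theta_true = np.array([0.8, -0.5])
y = rng.poisson(np.exp(X @ theta_true))   # observed event counts per cell

pop_share = 0.5   # population share of group 1 (illustrative)
alpha = 50.0      # penalty weight trading off accuracy vs. parity

def objective(theta):
    lam = np.exp(X @ theta)               # Poisson intensities per cell
    nll = np.sum(lam - y * np.log(lam))   # negative log-likelihood (up to a constant)
    # Share of total predicted intensity (a proxy for how patrol time
    # would be allocated) falling on group-1 cells; penalize deviation
    # from that group's population share (a demographic-parity term).
    share = lam[group == 1].sum() / lam.sum()
    return nll + alpha * (share - pop_share) ** 2

res = minimize(objective, np.zeros(d), method="BFGS")
theta_hat = res.x
lam_hat = np.exp(X @ theta_hat)
print("fitted theta:", theta_hat)
print("group-1 intensity share:", lam_hat[group == 1].sum() / lam_hat.sum())
```

Setting `alpha = 0` recovers the unpenalized maximum-likelihood fit; increasing it pulls the predicted allocation toward the parity target at some cost in fit, which is the accuracy–fairness trade-off the paper's title refers to.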

Cited by 23 publications (15 citation statements)
References 19 publications
“…While hotspot policing has been shown to yield crime rate reductions, there is the possibility of unwanted side effects of hotspot policing such as traffic stops that unfairly target minority populations, stop and frisk, and other police activities that have negative societal consequences. There has been some recent work on improving fairness of spatial crime forecasting algorithms (Wheeler 2019; Mohler et al. 2018) where a fairness penalty is added to the optimization algorithm. Future research may focus on incorporating fairness into learning to rank models of crime, similar to methods that incorporate fairness into learning to rank for information retrieval (Zehlike and Castillo 2018).…”
Section: Figure
confidence: 99%
“…In particular, it should reduce the number of minorities stopped and arrested by police. But that reduction comes with a cost of not unequivocally targeting the highest crime locations in a hot spots policing strategy (Mohler et al., 2018).…”
Section: Discussion
confidence: 99%
“…Current critiques of predictive algorithms often state that they ignore how historical processes shape the current data fed into the system, and as such likely perpetuate the current status quo (Harcourt, 2007; Lum & Isaac, 2016). Predictive algorithms are not limited to the single objective of improving forecast accuracy, though; they can be amended to take additional objectives into account (Hardt, Price, & Srebro, 2016; Mohler et al., 2018). Rather than saying predictive algorithms will hopelessly exacerbate racial disparities already prevalent in the criminal justice system, this is but one example of how they can be amended to attempt to reduce racial disparity.…”
Section: Discussion
confidence: 99%