As in other industries, police departments and court systems are using predictive models. In Chicago, this takes the form of the heat list, an algorithm that ranks individuals on their risk of being involved in violence. The CPD has used the algorithm for several years, and it has been mentioned in recent arrests. The creator of the heat list, Dr. Wernick, has stated on record that the CPD’s predictive program isn’t taking advantage of, or unfairly profiling, any specific group. "The novelty of our approach," he says, "is that we are attempting to evaluate the risk of violence in an unbiased, quantitative way." The heat list is said to incorporate no variables that could racially profile a person.
For scholars, it has been difficult to verify claims that these algorithms are unbiased. A recent ProPublica study shed light on the racial biases of one predictive-policing algorithm: a widely used risk-assessment tool created by Northpointe that estimates the likelihood a defendant will re-offend, similar in many ways to Chicago’s heat list. The key to ProPublica’s analysis is that it compared the algorithm’s predictions against what actually happened. Study the table below: the algorithm falsely flags African-American defendants as high risk far more often than White defendants, and falsely rates White defendants as low risk more often. An algorithm that is not supposed to profile unfairly is doing exactly that.
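ProPublica’s comparison can be reproduced in miniature: group the predictions by race, then compare error rates across groups. The sketch below uses invented toy records (the group labels, field layout, and counts are placeholders for illustration, not ProPublica’s actual data):

```python
# Sketch of a ProPublica-style error-rate audit on hypothetical data.
# Each record: (group, predicted_high_risk, actually_reoffended).
records = [
    ("A", True, False), ("A", True, True), ("A", False, False),
    ("A", True, False), ("B", True, True), ("B", False, False),
    ("B", False, True), ("B", False, False),
]

def false_positive_rate(records, group):
    """Share of non-reoffenders in `group` wrongly flagged as high risk."""
    negatives = [r for r in records if r[0] == group and not r[2]]
    flagged = [r for r in negatives if r[1]]
    return len(flagged) / len(negatives)

def false_negative_rate(records, group):
    """Share of reoffenders in `group` wrongly rated low risk."""
    positives = [r for r in records if r[0] == group and r[2]]
    missed = [r for r in positives if not r[1]]
    return len(missed) / len(positives)

for g in ("A", "B"):
    print(g, false_positive_rate(records, g), false_negative_rate(records, g))
```

If the false-positive rate is much higher for one group while the false-negative rate is higher for another, the predictions are not equally accurate across groups, which is precisely the asymmetry ProPublica reported.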
These sorts of biases often arise with predictive algorithms. To guard against them, the CPD should conduct a similar, publicly available analysis to identify potential biases in its own predictive algorithms. ProPublica, for example, has put its data and analysis online.