
Predicting Police Misconduct

In several articles over at fivethirtyeight.com, Rob Arthur has provided great background on the Chicago Police Department's use of predictive algorithms to identify potential police misconduct. The story goes back to 1996, when the CPD tried out a new neural network prediction tool, BrainMaker Professional, as covered in the Tribune.

Please read the article, and the Tribune coverage, for the full story, but here is a snippet:

The list of predictive factors Internal Affairs found using the software is consistent with other studies of police misconduct, including my own. Along with each officer’s past history of complaints, Internal Affairs identified personal stressors linked to bad behavior. If an officer had recently divorced or gone into serious debt, for example, he was flagged by the algorithm as more likely to commit misconduct in the future. Like employees of any other kind, cops are likely to see their job performance suffer when there is trouble in their personal lives.

The neural network didn’t last long: about two years from the first announcement to its formal shutdown. (And all its reports and predictions went missing at some point in that period.) Soon after the model produced its first predictions, the union intervened; its president, Bill Nolan, called the system “absolutely ludicrous.” In particular, he objected to the way administrators responded to the predictions: Internal Affairs handed over a list of about 200 officers to Human Resources, which called each one into the office for questioning the union called adversarial.

Human Resources then recommended some officers for a counseling program (about half of the flagged officers were already enrolled in counseling because of previous bad behavior). Nolan said police officers were being punished for crimes they had not yet committed.

At the time, the notion of using predictive analytics to forecast potentially criminal behavior was still quite foreign. Although 27 percent of departments reported using some kind of early warning system in 1999 (according to a Department of Justice study), most existing models were simple, based either on supervisor observations or on an officer’s exceeding a certain number of complaints in a given period. (Both Chicago’s current system and Charlotte’s previous algorithm use such thresholds.) The idea of a more sophisticated algorithm seemed spooky back then, and union leaders called the Chicago PD’s model a “crystal-ball thing.” Mark Lawrence, the CEO of California Scientific, received a handful of inquiries from other police departments, but he said interest in his software dropped off rapidly after the union’s well-publicized objections. (The Fraternal Order of Police did not respond to multiple requests for comment for this story.)
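The threshold-based early warning systems the excerpt describes, flagging any officer who exceeds a set number of complaints within a given period, can be sketched in a few lines. The officer IDs, window, and threshold below are purely illustrative assumptions, not the actual rules used by Chicago or Charlotte:

```python
from collections import Counter
from datetime import date, timedelta

def flag_officers(complaints, window_days=365, threshold=3, today=date(1999, 6, 1)):
    """Flag officers whose complaint count within the window meets the threshold.

    `complaints` is a list of (officer_id, complaint_date) pairs.
    The window and threshold here are hypothetical examples, not any
    department's actual policy.
    """
    cutoff = today - timedelta(days=window_days)
    # Count only complaints that fall inside the lookback window.
    counts = Counter(oid for oid, d in complaints if d >= cutoff)
    return sorted(oid for oid, n in counts.items() if n >= threshold)

# Hypothetical example data.
complaints = [
    ("A", date(1999, 1, 10)), ("A", date(1999, 2, 5)), ("A", date(1999, 4, 1)),
    ("B", date(1998, 12, 1)),
    ("C", date(1997, 3, 3)), ("C", date(1999, 5, 20)),
]
print(flag_officers(complaints))  # ['A']
```

The contrast with a neural network model is that a rule like this uses only one feature (complaint count) and a hard cutoff, which is transparent but misses the personal-stressor signals (divorce, debt) that BrainMaker reportedly picked up.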
