Chicago Police Department Goes High-Tech to Fight Rise in Killings

The last year has been rough in Chicago:

As the number of Chicago's homicides rose 58 percent in 2016, to 764, the clearance rate — killings that ended in an arrest — dropped from 36 percent to 26 percent, according to an analysis by the University of Chicago's Crime Lab.

The city is still leaning on technology to help fight crime. From NBC News:

The steps announced Friday included:

  • An expansion of ShotSpotter sensors that pick up the sound of gunfire and alert police to its location.
  • Near-ubiquitous coverage of public areas by surveillance cameras.
  • The introduction of predictive policing software that identifies areas most likely to see gun violence.
  • Layering that analysis with another predictive program that identifies people most likely to commit — or be the victims of — gun violence.
  • A "war room"-like office, staffed with analysts who will translate that data for changes to deployments and long-term strategy.

Revealing the biases in an “unbiased” predictive policing model

As in other industries, police departments and court systems are using predictive models. In Chicago, we know this as the heat list. The heat list is an algorithm that ranks individuals on their susceptibility to violence. The CPD has used this algorithm for several years, and it has been mentioned in recent arrests. The creator of the heat list, Dr. Wernick, is on record that the CPD’s predictive program isn’t taking advantage of — or unfairly profiling — any specific group. "The novelty of our approach," he says, "is that we are attempting to evaluate the risk of violence in an unbiased, quantitative way." The heat list is said not to incorporate any variables that could racially profile a person.

For scholars, it has been difficult to verify claims that these algorithms are unbiased. A recent study by ProPublica has shed light on the racial biases of a predictive policing algorithm. ProPublica studied a widely used algorithm created by Northpointe, which assesses the likelihood a defendant will reoffend. In many ways, this is similar to Chicago’s heat list. The key to ProPublica’s analysis is that they compared the algorithm’s predictions against what actually happened. Study the table below: it becomes apparent that the algorithm is biased against African-Americans and favors Whites. This is an algorithm that is not supposed to unfairly profile, yet it does.
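ProPublica's method boils down to a simple check: compare the algorithm's risk labels against what actually happened, broken out by group, and see whether the error rates differ. Here is a minimal sketch of that comparison; the counts below are hypothetical, purely for illustration (ProPublica's real data and analysis are online):

```python
# Sketch of a ProPublica-style bias check: compare error rates by group.
# All counts below are hypothetical, for illustration only.

def error_rates(high_risk_no_reoffend, total_no_reoffend,
                low_risk_reoffend, total_reoffend):
    """False positive rate: labeled high risk but did not reoffend.
    False negative rate: labeled low risk but did reoffend."""
    fpr = high_risk_no_reoffend / total_no_reoffend
    fnr = low_risk_reoffend / total_reoffend
    return fpr, fnr

# Hypothetical counts for two demographic groups
groups = {
    "Group A": error_rates(450, 1000, 280, 1000),
    "Group B": error_rates(230, 1000, 480, 1000),
}

for name, (fpr, fnr) in groups.items():
    print(f"{name}: false positive rate {fpr:.0%}, false negative rate {fnr:.0%}")
```

If the false positive rate is much higher for one group, the algorithm is flagging people in that group as dangerous who go on to not reoffend — exactly the asymmetry ProPublica found.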

These sorts of biases often happen with predictive algorithms. To guard against this, the CPD should conduct a similar, publicly available analysis to identify any potential biases in its predictive algorithms. As an example, ProPublica has put its data and analysis online.



Heat List: Correlation or Causation 

From the Tribune

The heat list is in the news again: 

Just last week, after the bloodiest weekend since he became superintendent, Eddie Johnson said much of the bloodshed is being driven by about 1,300 people on the list, compiled with the aid of a computerized algorithm.

Calling the initiative the Police Department's "largest raid in recent history," Anthony Guglielmi, the department's chief spokesman, said 140 people were arrested primarily on narcotics and weapons charges starting at 4 a.m. Thursday in the violence-plagued Harrison and Austin patrol districts on the West Side.  Police were targeting the Traveling Vice Lord and Four Corner Hustler street gangs in the crackdown. About 95 documented gang members were arrested, Guglielmi said.  In addition, all but 23 of those arrested Thursday were on the strategic subject list, police officials said.

"Those individuals need to know that if they don't choose to take an alternative lifestyle then we'll bring everything we have at our disposal, including our federal partners to come at them to put the weight of the Chicago Police Department on them to stop them from driving the violence in our city," Johnson said at a news conference Friday at police headquarters.

The algorithm used in compiling the list ranks the individuals on their susceptibility to violence. Some of those factors are their criminal background, their parole or warrant status, and any weapons or drug arrests. The department also takes into consideration their known acquaintances — and the acquaintances' arrest histories — and whether any of those associates have been shot in the past.

Police said 21 of those arrested in the raid have been calculated to be at least 300 times more likely than the average person to become a victim or offender of violence.
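The CPD's actual model and its weights are not public, but the kind of weighted scoring the Tribune describes — criminal background, parole status, weapons and drug arrests, plus the arrest and shooting histories of acquaintances — can be sketched as a toy. Every field name and weight here is hypothetical:

```python
# Toy illustration of the kind of factor-weighted risk score the
# Tribune describes. The real CPD model and its weights are not
# public; all fields and weights here are invented.

def toy_risk_score(person):
    score = 0
    score += 10 * person.get("prior_arrests", 0)
    score += 25 if person.get("on_parole") else 0
    score += 15 * person.get("weapons_arrests", 0)
    score += 10 * person.get("drug_arrests", 0)
    # Network effects: acquaintances' arrests and shootings
    score += 5 * person.get("associate_arrests", 0)
    score += 20 * person.get("associates_shot", 0)
    return score

print(toy_risk_score({"prior_arrests": 2, "on_parole": True}))  # 45
```

Even this crude version shows why the "300 times more likely" framing deserves scrutiny: the score is only as fair as the inputs, and arrest-history inputs carry whatever biases past enforcement did.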

As far as I know, this is the first time the city has discussed the heat list so prominently in connection with arrests. A couple of quick reactions. First, it's not clear whether the arrests are correlated with the heat list or caused by it. Simply put: were arrests made on the basis of predictions? Or is there merely a correlation between being arrested and being on the heat list? This will grow to be an important distinction, because eventually arrests could be based on the heat list.

Second, let's run the numbers: 1,300 people are on the list, and this weekend police arrested 140 people. 16 percent of those arrested were not on the heat list (23/140), and 68 percent were known gang members (95/140). Assuming all 95 gang members were on the list, that leaves 22 of those arrested (about 16 percent) who were on the heat list but not affiliated with a gang. This gives a little insight into the makeup of the list. Finally, what is the lasting effect of arrests based on weapons and/or narcotics? How long will these people be off the street?
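The back-of-the-envelope arithmetic is worth laying out explicitly. The only assumption beyond the reported figures is that all 95 documented gang members were on the list:

```python
# Quick check of the raid numbers reported by the Tribune.
arrested = 140
not_on_list = 23
gang_members = 95

on_list = arrested - not_on_list                 # 117 arrestees on the list
print(f"{not_on_list / arrested:.0%} of arrestees were not on the heat list")
print(f"{gang_members / arrested:.0%} of arrestees were documented gang members")

# Assumption: all 95 gang members were on the heat list. Then the
# list members among the arrested with no gang affiliation:
on_list_non_gang = on_list - gang_members        # 22
print(f"{on_list_non_gang} arrestees on the list but not gang-affiliated")
```

That 22-person remainder is the interesting slice: people the algorithm flagged who don't fit the gang-enforcement frame.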


Predicting Police Misconduct

In several articles, Rob Arthur has provided a great background on the use of predictive algorithms by the Chicago Police Department to identify potential police misconduct. The story goes back to 1996, when the CPD tried out a new neural network predictive algorithm, BrainMaker Professional, as covered in the Tribune.

Please read the article and the Tribune coverage for the full story, but here is a snippet:

The list of predictive factors Internal Affairs found using the software is consistent with other studies of police misconduct, including my own. Along with each officer’s past history of complaints, Internal Affairs identified personal stressors linked to bad behavior. If an officer had recently divorced or gone into serious debt, for example, he was flagged by the algorithm as more likely to commit misconduct in the future. Like employees of any other kind, cops are likely to see their job performance suffer when there is trouble in their personal lives.

The neural network didn’t last long: about two years from the first announcement to its formal shutdown. (And all its reports and predictions went missing at some point in that period.) Soon after the model produced its first predictions, the union intervened; its president, Bill Nolan, called the system “absolutely ludicrous.” In particular, he objected to the way administrators responded to the predictions: Internal Affairs handed over a list of about 200 officers to Human Resources, which called each one into the office for questioning the union called adversarial.

Human Resources then recommended some officers for a counseling program (about half of the flagged officers were already enrolled in counseling because of previous bad behavior). Nolan said police officers were being punished for crimes they had not yet committed.

At the time, the notion of using predictive analytics to forecast potentially criminal behavior was still quite foreign. Although 27 percent of departments reported using some kind of early warning system in 1999 (according to a Department of Justice study), most existing models were simple, based either on supervisor observations or on an officer’s exceeding a certain number of complaints in a given period. (Both Chicago’s current system and Charlotte’s previous algorithm use such thresholds.) The idea of a more sophisticated algorithm seemed spooky back then, and union leaders called the Chicago PD’s model a “crystal-ball thing.” Mark Lawrence, the CEO of California Scientific, received a handful of inquiries from other police departments, but he said interest in his software dropped off rapidly after the union’s well-publicized objections. (The Fraternal Order of Police did not respond to multiple requests for comment for this story.)
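The simpler threshold-based early warning systems described above — flag any officer who exceeds a certain number of complaints in a given period — are easy to sketch. The field names, window, and cutoff here are hypothetical:

```python
# Sketch of a threshold-style early warning system: flag officers whose
# complaint count in a rolling window exceeds a cutoff. The window and
# threshold values are hypothetical.

from datetime import date, timedelta

def flag_officers(complaints, window_days=365, threshold=3, today=None):
    """complaints: iterable of (officer_id, complaint_date) pairs.
    Returns the set of officer ids with more than `threshold`
    complaints in the last `window_days`."""
    today = today or date.today()
    cutoff = today - timedelta(days=window_days)
    counts = {}
    for officer_id, when in complaints:
        if when >= cutoff:
            counts[officer_id] = counts.get(officer_id, 0) + 1
    return {oid for oid, n in counts.items() if n > threshold}
```

Contrast this with BrainMaker's neural network: the threshold rule is transparent and easy to contest, which is partly why the unions tolerated it while calling the opaque model a "crystal-ball thing."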


Graduated Fines for Red Light Violations

From thenewspaper:

In many areas (including Chicago), a red light camera will issue a ticket if you take a right turn on a red light without a complete stop. These turns are much less risky than blowing straight through the intersection, nevertheless, they pay the same fine . . . 

The newspaper discusses a study, "A Method for Determining Red-Light-Running Fines and Evaluating Intersection Characteristics at Red-Light-Camera Intersections".  I have not read the paper yet, but wanted to share the newspaper's analysis.   The study develops a graduated fine based on how late a car runs a red light:

The study calculated the probability of a conflict between opposing traffic when a vehicle runs the red light. A simulation model was then used to determine the risk of a crash based on how long the light had been red when the vehicle enters the intersection. The probability of a crash between 0.1 and 3.5 seconds after the light turned red was found to be zero percent, largely because the all-red clearance interval prevents conflict. The risk jumps significantly after the light has been red for 4.4 seconds or longer.

The researchers used $85,438 as an estimate of the cost of a serious red light running crash. They then proposed a red light camera ticket issued when the light has been red for less than 3.5 seconds should be $0, but it should be $200 if the light had been red for 3.8 seconds. The cost would top out at $82,985 for a ticket issued 7.0 seconds after the red.
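The article reports only three points on the study's fine schedule ($0 up to 3.5 seconds, $200 at 3.8 seconds, topping out at $82,985 at 7.0 seconds); the study's actual formula isn't reproduced. As a stand-in, a sketch that linearly interpolates between those reported points would look like this:

```python
# Sketch of a graduated red-light fine. The study's exact formula
# isn't given in the article; this linearly interpolates between the
# three reported points as a stand-in.

def graduated_fine(seconds_into_red):
    points = [(3.5, 0.0), (3.8, 200.0), (7.0, 82985.0)]
    if seconds_into_red <= points[0][0]:
        return 0.0
    if seconds_into_red >= points[-1][0]:
        return points[-1][1]
    for (t0, f0), (t1, f1) in zip(points, points[1:]):
        if t0 <= seconds_into_red <= t1:
            frac = (seconds_into_red - t0) / (t1 - t0)
            return f0 + frac * (f1 - f0)

print(graduated_fine(3.8))  # 200.0
```

The steep jump between 3.8 and 7.0 seconds reflects the study's finding that crash risk rises sharply once the light has been red for several seconds; the real curve presumably tracks that risk rather than a straight line.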

This sort of sliding scale appears in other contexts, such as congestion pricing. I think it's an interesting idea to consider: scaling the fine to the degree of risk.



Chicago's Body Camera Statistics

From the PATF Report:

Body cameras are a promising technological tool to protect both the public from police misconduct and police officers from false allegations of misconduct. They promote accountability and transparency. The presence of body cameras can also de-escalate encounters, resulting in improved behavior among both police officers and the public. The commander in charge of CPD’s body camera pilot program, Marc Buslik, recently explained this phenomenon: “When they know they are being recorded, both sides, everything becomes less intense”; “[t]he camera brings everything down on both sides. Officers noticed right away.”422

CPD is already embracing the use of body cameras. In January 2015, CPD initiated a body camera pilot program.423 The program initially involved 30 body cameras on officers working the 2:00 p.m. to midnight shift in the Shakespeare District (14th). Though the sample size is small, initial results were promising. Since the program was launched, complaints filed against officers for that district/watch fell by 26%, and excessive force complaints fell to zero in 2015 compared with seven in 2014.424 In 2016, CPD is expanding the pilot program to all three watches in six additional police districts—Wentworth (2nd), South Chicago (4th), Gresham (6th), Deering (9th), Ogden (10th), and Austin (15th).425

Police departments nationwide are increasingly using body cameras. As of early 2015, about 25% of the nation’s 17,000 police agencies were using them in whole or in part, with 80% evaluating the technology.426 In Los Angeles, the LAPD is outfitting every officer with body cameras.427 While empirical data is still trickling in, several studies have documented substantial decreases in citizen complaints, use of force, and assaults on officers after body cameras were distributed.428 There is some debate about whether these declines are attributable to improved officer behavior, improved citizen behavior, or citizens being less likely to file frivolous complaints (or some mix). Regardless, these are all positive developments.


422 Paul Biasco, How Chicago Police Hope Body Cameras Will Restore The Public’s Trust, (Jan. 7, 2016), available at

423 CPD, Body Worn Camera Pilot Program – Phase 1, Department Notice D15-01 (Jan. 1, 2016).

424 Id.

425 CPD, Office of News Affairs, Mayor Emanuel and Police Superintendent Escalante Announce Districts for Body-Worn Camera Expansion (Dec. 23, 2015), available at

426 Jay Stanley, Police Body-Mounted Cameras: With Right Policies in Place, A Win For All, ACLU (Mar. 2015), available at

427 Kate Mather, A Fight Over Access to Video from LAPD Body Cameras is Shaping Up, Los Angeles Times (Feb. 5, 2015), available at

428 Michael D. White, Police Officer Body-Worn Cameras, Washington D.C.: Office of Community-Orientated Policing Services (2014), available at


Chicago police soon to wear more body cameras

From the Chicago Tribune

For background, the Chicago Police Department has come under a lot of criticism for its handling of the Laquan McDonald case. Part of this has focused on surveillance equipment; for example, five of the responding police cars had dash cams, but they were all broken.

As part of the fallout, there has been an emphasis on police accountability through body cameras. The city has purchased 450 body cameras capable of recording 72 continuous hours of high-definition video and audio on a single charge.

The new cameras, part of a city pilot program, began in January 2015 in the Shakespeare District, which covers the Logan Square and West Town communities on the North and Northwest sides. The six new police districts, which encompass one-third of the city, cover the South Shore, Auburn Gresham, Chatham, Washington Park, Hyde Park, Kenwood, Back of the Yards, Brighton Park, Bridgeport, Austin, North Lawndale and Little Village communities.

"Body cameras are one tool that the police department uses to serve and protect the people of Chicago," interim Superintendent Eddie Johnson said in a statement. "They play an important role in not just fighting crime, but also in learning from actual encounters with the public. In addition to wearing a body camera myself, I've asked my command staff to wear one as well to demonstrate our commitment to rebuilding trust with the residents we're sworn to serve."

While Johnson's wearing of a camera sounds gimmicky to me, maybe it is carrying a message to the other officers. My immediate concern is understanding the effects of these cameras: do they change police interactions with the public, how do they affect officers doing their work, and what happens with the footage? One part of the article seemed to revert back to the police department of the past:

Former Superintendent Garry McCarthy was one of the major proponents of the body camera initiative during his tenure as top cop, pointing to research that showed that citizen complaints dropped by as much as 80 percent for some police departments using body cameras.  On Sunday, the department said citizen complaints against police "drastically" dropped in the first phase of the pilot program, though no statistics were given.

So why quote statistics or allude to quantitative measures when you can't back them up? It also raises the question of why complaints dropped: did officers change their behavior, or did people react differently to police wearing cameras?


Are surveillance cameras making Chicago safer?

WGN recently did a very good story on surveillance cameras in the city.

Here is a snippet:  

A spokesperson for the Cook County State's Attorney's office tells WGN-TV they can't identify a precise number of cases where cameras were utilized, though they have been helpful in some cases. Thanks to a million dollar grant, the Illinois Institute of Technology is working with the city to study the impact of cameras on crime. In the meantime, the technology is getting cheaper. The images are clearer. And by all accounts, surveillance cameras are here to stay.  You can find much more by clicking these web extras and links.

Take a look at the site for the full story as well as additional interviews.  I was interviewed for an hour by the reporters, so they really understood the issues and nuances around surveillance in Chicago.  One positive outcome of this story is that the CPD has reached out to me over my concerns about transparency and lack of empirical data on the cameras.  I plan on engaging with them.


Northwestern Red Light Camera Study

Northwestern University is getting a generous grant, $311,778, from the city of Chicago to study the effectiveness of red light cameras. This is in response to a promise by the Chicago Department of Transportation to have an academic review of the cameras. I can get a good feel for the eventual results of the study based on these comments (from CBS):

Mahmassani said the Tribune study “had good points.” But, he said, “It’s generally known that, when you introduce red-light camera enforcement, you may experience some increase in rear-end, relatively minor crashes. Where you have reduction is the more dangerous sideswipe, right-angle crashes.”

I am curious about the approach that will be used here. The cameras have been in place for up to 13 years depending on the intersection, and the definition of an accident changed in 2009. My own 2010 study did find a slight reduction in angle and turning crashes. So if right-angle accidents generally result in $6,000 of damage versus $2,000 for a rear-end, the argument Mahmassani will likely make is that it's acceptable for accidents to rise, as long as the cost to society decreases. My position then (and still now) is that the cameras have not shown themselves to significantly reduce accidents. Still, I would like to see a methodologically rigorous study that could make an accurate determination of the true costs or benefits of the red light cameras.
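The cost-weighted tradeoff argument can be made concrete with a quick calculation, using the illustrative per-crash damage figures above; the crash counts here are hypothetical:

```python
# Cost-weighted crash comparison using the illustrative damage figures
# from the text ($6,000 per angle crash, $2,000 per rear-end crash).
# The before/after crash counts are hypothetical.

ANGLE_COST = 6_000
REAR_END_COST = 2_000

def societal_cost(angle_crashes, rear_end_crashes):
    return angle_crashes * ANGLE_COST + rear_end_crashes * REAR_END_COST

# Before cameras: 50 angle + 100 rear-end = 150 crashes
before = societal_cost(50, 100)   # 500000
# After cameras: fewer angle crashes, more rear-ends = 160 crashes
after = societal_cost(40, 120)    # 480000

# Total crashes rose, yet the modeled cost to society fell.
print(before, after)
```

This is exactly why the choice of metric matters: a study reporting crash counts and a study reporting crash costs can reach opposite conclusions from the same data.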


Definition of an accident

Those who have read my studies know that an important differentiator between them has been the time period of analysis. In 2009, IDOT changed the definition of an accident, so it is unfair to compare accidents before the change with accidents after it. While this may seem nuanced, it makes a big difference when looking at whether cameras reduce accidents. Chicago was able to take advantage of this in its push for red light cameras. An article by the Daily Herald focuses on this issue:

The article also notes the lack of public access to the data.


Red-light cameras began proliferating at suburban intersections in 2009 with the justification that they would prevent crashes. The same year, the Illinois Department of Transportation raised the dollar threshold necessary to report property damage crashes from $500 to $1,500. In one fell swoop, reported crashes shrank statewide by 30 percent -- from an average of 413,235 a year to an average of 287,718, IDOT officials said.

How much of the credit for reducing crashes should go to red-light cameras?

 . . 

The 2009 shift isn't the only problem in trying to objectively analyze red-light camera data. The Daily Herald's analysis also found an inconsistent system of reporting crash data to IDOT and lack of public access to statewide data.

That's why a number of experts and good government groups are calling for greater transparency and reforms in how crash data is reported.

IDOT updated the property damage standard in 2009 to reflect higher vehicle repair costs.

"As a result, a lot fewer accidents were reported," surveillance camera expert Rajiv Shah said.

The reduced crash rates partly deflected the public uproar over red-light cameras at that time.

But the decline in crashes did not carry through to 14 suburban intersections where cameras were installed after 2009.

One intersection had incomplete data, but at 10 of the 13 others crashes decreased in 2009 -- before cameras were in place at those locations.

After cameras were installed, crashes increased or stayed the same at eight of the 14 intersections, or 57 percent.

"I think everybody was a bit oversold on the promise of cameras being like that silver bullet and reducing crashes," said Shah, an adjunct associate professor of communications at the University of Illinois at Chicago who has studied and written about red-light cameras.

 . . 

Obtaining statewide data requires a Freedom of Information Act request and an in-person visit to IDOT offices.

"One of the primary objectives of making the post-installation reports accessible is transparency," IDOT spokesman Guy Tridgell said. "We make them readily available for members of the public interested in learning more about the impact red-light cameras are having on their communities.

"The Illinois Department of Transportation is currently exploring other options that could give us a better picture of how effective the cameras are at improving safety," he said.

The state of Illinois requires municipalities with red-light cameras to provide one- and three-year reports on the cameras. But the data is submitted on paper, not electronically. The law is vague, only requiring a statistical analysis "based upon the best available crash, traffic, and other data."

The law tells local officials to undertake additional studies if crashes have increased in lanes monitored by cameras. However, it leaves it up to their discretion whether to take action to rectify the problem. In a number of cases, the reports are prepared by red-light camera vendors.

Shah supports a comprehensive public database of crashes.

"It should be open to the public to review to be sure these cameras are indeed effective -- and if not -- remove them and put in place a warning program. It would be much more sensitive and transparent."