For the last few months, I have been working on a project using Chicago's crime statistics. It's great that the city has finally published them and made them available to the public. (I was pushing for this many years ago, when the city would only share 30 days of crime data.)
I want to highlight a couple of issues with the crime dataset. First, the city deliberately obfuscates some of the published data for reasons of public safety; one example is adding noise to the locations of crimes. Second, there is a strong bias by the police toward underreporting crime. This is a response to the CompStat philosophy of emphasizing quantitative progress on crime: police have an incentive to underreport crimes in order to make themselves look good. It is a natural (and expected) response to an emphasis on these quantitative measures. In many ways, it's analogous to how teachers focus on teaching students to do well on standardized tests.
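To make the first issue concrete, here is a minimal sketch of what location obfuscation by added noise can look like. This is purely illustrative: the function name, the uniform-noise approach, and the offset size are my own assumptions, not the city's actual (undocumented, as far as I know) scheme.

```python
import random

def jitter_location(lat, lon, max_offset_deg=0.001):
    """Return (lat, lon) with uniform random noise added to each coordinate.

    max_offset_deg=0.001 is an arbitrary illustrative value (about 100 m
    of latitude); it is NOT the city's actual obfuscation parameter.
    """
    return (lat + random.uniform(-max_offset_deg, max_offset_deg),
            lon + random.uniform(-max_offset_deg, max_offset_deg))

# Example: jitter a point near downtown Chicago.
noisy_lat, noisy_lon = jitter_location(41.8781, -87.6298)
```

The practical consequence for analysis is that published coordinates should be treated as approximate: fine-grained spatial work (e.g., matching incidents to individual addresses) will be unreliable, while block- or neighborhood-level aggregation is largely unaffected.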
There are several recent stories that led me to discuss this issue. The first is an in-depth investigation of crime statistics by Chicago magazine. The article, which also draws on a study by Chicago's Office of Inspector General, applies a careful analysis to the crime statistics and uses a number of cases to illustrate how they are manipulated. Here are some snippets from the article:
Chicago Magazine conducted a 12-month examination of the Chicago Police Department’s crime statistics going back several years, poring through public and internal police records and interviewing crime victims, criminologists, and police sources of various ranks. We identified 10 people, including Groves, who were beaten, burned, suffocated, or shot to death in 2013 and whose cases were reclassified as death investigations, downgraded to more minor crimes, or even closed as noncriminal incidents—all for illogical or, at best, unclear reasons.
This troubling practice goes far beyond murders, documents and interviews reveal. Chicago found dozens of other crimes, including serious felonies such as robberies, burglaries, and assaults, that were misclassified, downgraded to wrist-slap offenses, or made to vanish altogether.
Take “index crimes”: the eight violent and property crimes that virtually all U.S. cities supply to the Federal Bureau of Investigation for its Uniform Crime Report. According to police figures, the number of these crimes plunged by 56 percent citywide from 2010 to 2013—an average of nearly 19 percent per year—a reduction that borders on the miraculous. To put these numbers in perspective: From 1993, when index crimes peaked, to 2010, the last full year under McCarthy’s predecessor, Jody Weis, the average annual decline was less than 4 percent.
This dramatic crime reduction has been happening even as the department has been bleeding officers. (A recent Tribune analysis listed 7,078 beat cops on the streets, 10 percent fewer than in 2011.) Given these facts, the crime reduction “makes no sense,” says one veteran sergeant. “And it makes absolutely no sense that people believe it. Yet people believe it.”
The city’s inspector general, Joseph Ferguson, may not. Chicago has learned that his office has questioned the accuracy of the police department’s crime statistics. A spokeswoman confirmed that the office recently finalized an audit of the police department’s 2012 crime data—though only for assault-related crimes so far—“to determine if CPD accurately classified [these categories of] crimes under its written guidelines and if it reported related crime statistics correctly.” (The audit found, among other things, that the department undercounted aggravated assaults and batteries by more than 24 percent, based on the sample cases reviewed.)
All of this creative number crunching, former police officials say, is a radical departure from past practices. Veteran members of the force blame McCarthy. Muddling murder statistics “benefits no one but the superintendent,” says the retired high-level detective. “Not the citizens, not the investigators. It only benefits him.”
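As an aside on the figures quoted above: the "nearly 19 percent per year" is a simple average (56/3 ≈ 18.7). Expressed as a compound annual rate of decline, the same 56 percent drop over three years works out closer to 24 percent per year, which makes the contrast with the sub-4-percent annual declines of 1993 to 2010 even starker. A quick check:

```python
# Sanity-check the quoted figures: a 56% drop in index crimes
# over the three years from 2010 to 2013.
total_decline = 0.56
years = 3

# Simple average, as quoted in the article ("nearly 19 percent per year").
simple_avg = total_decline / years

# Equivalent compound annual rate of decline.
compound_avg = 1 - (1 - total_decline) ** (1 / years)

print(f"simple: {simple_avg:.3f}, compound: {compound_avg:.3f}")
```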
The second article I want to mention (for those of you still with me) is a blog post by Corey Yung. Corey explains how the manipulation of crime statistics is hardly unique to Chicago.
In the mid-1990s, the Philadelphia Inquirer caught the local police gaming the rape statistics sent to the FBI. The city police would regularly classify rape complaints as “investigate persons” without further inquiry. As a result, the city was able to announce lower violent crime rates based upon faulty data. In 2005, the St. Louis Post-Dispatch uncovered similar practices in St. Louis. There, the police used informal memos instead of written complaints to record allegations of rape. These memos were not counted in official crime numbers. The police even pressured victims to sign waiver forms releasing police from any obligation to further investigate their complaints. In 2009, the Times-Picayune and Baltimore Sun found large-scale rape data manipulation in New Orleans and Baltimore. The Baltimore police took advantage of the “unfounded” rule wherein police do not have to count criminal complaints deemed false. However, the department regularly used the category with little or no investigation performed. New Orleans police repeatedly downgraded offenses to crimes that were not counted in official stats. According to the investigation, over half of New Orleans rape complaints were designated as “Signal 21” which was a non-criminal category where rape cases went to die.