July | August 2017
by Katherine Barrett and Richard Greene
It may appear that efforts to adopt an evidence-based approach, using data to improve the effectiveness, efficiency and fairness of law enforcement, had their genesis back in 1995, when New York City kicked off work on its so-called CompStat system. In that highly successful effort, geographic information systems, or GIS, were used to identify the places in the city where officers could be deployed to best effect.
It worked so well that New York’s crime rates plummeted, and a number of other places tried to emulate the approach. But while CompStat may have been at the forefront of using technology in this way, “the history of quantitative crime analysis spans decades,” wrote Jennifer Bachner, a director in the Johns Hopkins University Center for Advanced Governmental Studies.
As Bachner pointed out, in 1829 “an Italian geographer and French statistician designed the first maps that visualized crime data,” including three years of property crime rates as well as education information garnered from France’s census. The maps showed a correlation between the two: more education tended to accompany less crime.
Jump forward about 190 years and you’ll find that a number of states, counties and cities have been using the seemingly magical capacity of computers to advance this work dramatically. Gathering data to guide law enforcement is becoming increasingly common, and it goes far beyond the traditional crime rates that the FBI has collected for years.
Many states and localities have now started to gather and analyze all kinds of interactions between the police and the citizenry, above and beyond simple arrest rates. Beginning next year, for example, California will require by law that every law enforcement agency in the state report all instances in which a shooting takes place between an officer and an individual.
North Carolina is one of a number of states that have mandated reporting on “traffic stops that might reveal patterns of bias,” said Denice Ross, senior adviser to the Police Data Initiative, a community of practice of more than 60 law enforcement agencies around the country that was established by the President’s Task Force on 21st Century Policing.
Alabama has mandated that all police departments in the state gather data about traffic stops of minorities. But it doesn’t end there. That information is reported every month to the attorney general and the Alabama Department of Public Safety, which can watch for trends that indicate possible bias against minority groups in certain parts of the state.
One huge advantage of this kind of data gathering is the capacity of states and cities to disseminate information on a timely basis. “The FBI takes roughly nine months to publish the annual nationwide crime data,” said Ross. By contrast, she pointed to states like New Jersey, which publishes its law enforcement data electronically every month, making crime trends available to citizens and policymakers.
Some of the challenges that must be overcome to make the best use of data in law enforcement were laid out in Bachner’s paper for the IBM Center for the Business of Government, “Predictive Policing: Preventing Crime with Data and Analytics.”
Of course, just gathering more and more information about crime is hardly a panacea. Ross said, for example, that New Jersey’s monthly reports are sent out in PDF form, which means that the data in them cannot be easily manipulated and analyzed. “The timeliness is a great step forward,” she said. “But it would be great if they put it out in some other way than as a PDF.”
What’s more, though technology can be extremely useful, it’s not the be-all, end-all. For one thing, it cannot replace basic elements of community policing such as “nourishing neighborhood partnerships” or “fostering a reputation characterized by legitimacy and fairness,” wrote former Berkeley Law Professor David Sklansky.
Then there’s the simple fact that no computer ever made a drug bust all by itself. Sufficient manpower is still critical. In the Albuquerque, New Mexico, area, for example, the shortage of police officers is so severe that even the power of data can’t make up for it.
“In Albuquerque our police departments are at half force. And they’ve been trying to Band-Aid the problem with technology for a long time,” said New Mexico State Auditor Tim Keller, a 2014 CSG Toll Fellow. “But response times are up to 90 minutes now. If there’s a burglary, a cop may not show up until the next day.”

About Barrett and Greene

CSG Senior Fellows Katherine Barrett and Richard Greene are experts on state government who work with Governing magazine, the Pew Charitable Trusts, the Volcker Alliance, the National Academy of Public Administration and others. As CSG senior fellows, Barrett and Greene serve as advisers on state government policy and programming and assist in identifying emerging trends affecting states.