August 31, 2016

“Predictive policing” is happening now — and police could learn a lesson from Minority Report.

David Robinson

In the movie Minority Report, mutants in a vat look into the future, and tell Tom Cruise who is about to commit a crime, so he can arrest the offender before the crime happens. Spoiler alert: Those mutant fortune tellers turn out not to be infallible, but the cops treat them as though they were. Law enforcement’s blind faith in a tool that doesn’t always work — a tool that can easily finger the wrong person, with terrible results — provides the central tension for that blockbuster film.

Real police are now at risk of making a similar mistake. But this time the results are in the street, not at the box office. Today’s cops don’t rely on soothsayers, but they do increasingly use software to forecast where future crimes may happen, or who may be involved. And they are nowhere near skeptical enough of the forecasts those computers are making.

Today, a national coalition of 17 advocacy groups is raising the alarm about this, with a shared statement highlighting six ways that this trend threatens civil rights.

Civil rights sign-ons

These groups all agree — the current rush toward predictive policing is wrong.

Upturn, where I work, helped draft the new statement, and today we’re releasing a report designed to empower you to get beyond the hype and make up your own mind about what the industry calls “predictive policing.” As lead author of that report, here’s what I’d like you to know.

Police can easily trust these tools too much.

People often overestimate the accuracy, objectivity, and reliability of information that comes from a computer, including from a predictive policing system. The RAND Corporation, which has done the best studies to date, is a famously buttoned-down kind of place, but its researchers are just as bothered by this problem as I am. They write: “[p]redictive policing has been so hyped that the reality cannot live up to the hyperbole. There is an underlying, erroneous assumption that advanced mathematical and computational power is both necessary and sufficient to reduce crime [but in fact] the predictions are only as good as the data used to make them.”

Police data about crime paints a distorted picture, which can easily lead to discriminatory policing patterns.

These systems only predict which crimes police will detect in the future — and that creates a distorted picture. As an eminent criminologist once said, “[i]t has been known for more than 30 years that, in general, police statistics are poor measures of true levels of crime.”

In the context of predictive policing, statistics generated by the policing process are often treated as though they are records of underlying criminal behavior. But these numbers are a direct record of how law enforcement responds to particular crimes, and they are only indirect evidence about what is actually happening in the world. Criminologists argue that “[a]rrest, conviction, and incarceration data are most appropriately viewed as measures of official response to criminal behavior.”
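To make that distortion concrete, here is a minimal sketch in Python. The numbers are invented for illustration, not drawn from our report: two neighborhoods with identical underlying offense rates produce very different recorded counts once patrol intensity differs.

```python
# Hypothetical illustration (all numbers invented): identical true offense
# rates, combined with different patrol intensity, yield very different
# *recorded* counts, and recorded counts are what predictive systems learn from.

true_offenses = {"Neighborhood A": 100, "Neighborhood B": 100}
detection_rate = {"Neighborhood A": 0.40,   # heavily patrolled
                  "Neighborhood B": 0.10}   # lightly patrolled

recorded = {n: true_offenses[n] * detection_rate[n] for n in true_offenses}
print(recorded)  # {'Neighborhood A': 40.0, 'Neighborhood B': 10.0}

# A model trained on these records would "see" four times as much crime in
# Neighborhood A, even though the underlying behavior is identical.
```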

Video review

Of course, it makes sense for police to be responsive to community needs. Different communities served by the same police department often do have different needs, and different levels of need. That means officers will see more of what goes on in some neighborhoods than they will in others. But it is dangerous to treat the results of that process as though they were a completely neutral reflection of the world.

As data scientist Cathy O’Neil explains, “people have too much trust [that] numbers [will] be intrinsically objective.”

Rather than changing their tactics, police who use predictive tools have tended to focus on generating more citations and arrests. I read everything I could find about how police are actually using predictive policing tools, and the consistent answer was that officers aren't being guided toward different or more humane tactics. Instead, wherever the computer says to focus, the police simply do more enforcement, which in turn worsens the data problem described above.
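To see how quickly that spiral can take hold, here is a toy simulation, with every parameter invented for illustration: two areas have the same true crime rate, but a small, arbitrary imbalance in the historical records steers patrols, and each new detection reinforces the imbalance.

```python
import random

# A toy model of the feedback loop (all parameters invented): patrols are
# allocated in proportion to recorded crime, and only patrolled crime gets
# recorded, so an arbitrary early imbalance in the data compounds over time.

TRUE_RATE = 0.5              # identical underlying crime rate in both areas
recorded = {"A": 6, "B": 5}  # slight, arbitrary imbalance in old records

for _ in range(365):
    total = recorded["A"] + recorded["B"]
    for area in recorded:
        patrol_share = recorded[area] / total        # "predictive" allocation
        if random.random() < TRUE_RATE * patrol_share:
            recorded[area] += 1                      # only detected crime is logged

print(recorded)
# In a typical run, the gap between A's and B's recorded counts persists and
# grows: the system's output reflects its own past allocations, not any real
# difference in the underlying crime rates.
```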

Data could be used in ways that strengthen civil rights, but we’re missing that opportunity.

This, to me, is one of the most exciting things we found in our research. To quote from our report:

In most of the nation, police currently measure outcomes and assess performance based on only some of the activities, costs, and benefits that matter in policing…

Serious violent crimes will always be important. But violent crime doesn’t reflect the full scope of community concerns…

[E]xperts on police performance measurement have long argued that police should track all uses of coercive authority so they can better promote public safety with minimum coercion…. And research on police performance measurement consistently calls for surveying victims to gather their feedback on the police officers with whom they interact.

Beyond the basic goal of constitutional, lawful policing, measuring factors like these could allow the police to track and reward strategies that do a better job of balancing a community’s needs and interests. As a White House report recently found, “if feedback loops are not thoughtfully constructed, a predictive algorithmic system … could perpetuate policing practices that are not sufficiently attuned to community needs and potentially impede efforts to improve community trust and safety.” In other words, police and the systems they use might gather data, make predictions, and base decisions not only on where future crimes may be found, but also on the need to economize their use of authority, maintain community trust, and provide positive experiences for crime victims.
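As a thought experiment, here is a sketch of what deciding on more than predicted crime might look like. Every field name, weight, and number below is invented, not drawn from any real product; the point is only that coercion and victim-feedback measures can sit alongside a crime forecast in the same decision.

```python
from dataclasses import dataclass

# Hypothetical sketch: scoring candidate strategies on more than predicted
# crime. All names, weights, and numbers are invented for illustration.

@dataclass
class Strategy:
    name: str
    predicted_crime_reduction: float  # 0..1, from a forecasting model
    coercion_used: float              # 0..1, e.g. stops and arrests per contact
    victim_satisfaction: float        # 0..1, from victim surveys

def score(s: Strategy, w_crime=0.5, w_coercion=0.25, w_victims=0.25):
    """Higher is better; heavy use of coercive authority counts against."""
    return (w_crime * s.predicted_crime_reduction
            - w_coercion * s.coercion_used
            + w_victims * s.victim_satisfaction)

strategies = [
    Strategy("saturation patrols", 0.6, 0.8, 0.4),
    Strategy("community problem-solving", 0.5, 0.2, 0.7),
]
print(max(strategies, key=score).name)
# Under these weights, "community problem-solving" wins even though it scores
# slightly lower on predicted crime reduction alone.
```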

Predictive policing tools are being widely adopted before their impact can be measured, with little transparency and, often, no public engagement.

In our survey of the nation’s 50 largest police departments plus select others that have considered these technologies, we found that at least 20 of them have used a predictive policing system, and at least 11 more are looking into doing so. Yet only a few cities have seen a meaningful public debate about how these technologies operate or what policies control their use. In fact, in our search for public information about how police use these systems, Chicago was the only place with any policy governing the use of predictive policing at all.

Today’s tools may not make anyone safer.

Our report sums up the evidence:

We are currently aware of two rigorous, scholarly studies of predictive policing in the United States whose authors have no interest in the success of the method being evaluated. Both of these were conducted by the RAND Corporation. Neither analysis found any safety benefit in the predictive policing tools studied. In Chicago, evaluating an early version of the city’s person-based Strategic Subject List, RAND found that the effort “does not appear to have been successful in reducing gun violence.” [And in] Shreveport, Louisiana, RAND evaluated a tool that the police department had developed in-house, and found “no statistical evidence that [the program] as implemented reduced crime or any of its subtypes.”

In the full report, we also review studies conducted by the people who build predictive policing tools. Unsurprisingly, several of those studies do find a public safety benefit.

A new approach to data in policing is needed.

Our report recommends:

  • More independent, rigorous validation of predictive techniques

  • The inclusion of more data that reflects community priorities in these systems

  • More public engagement about the risks and benefits of these systems at both a local and national level

  • Informed public approval before these systems are deployed

Though these systems are rolling out in police departments nationwide, we found pervasive, fundamental gaps in what’s publicly known about them. How these tools work and make predictions, how they define and measure their own performance, and how police departments actually use them day to day all remain unclear. On top of that, vendors routinely claim that the inner workings of their technology are proprietary, keeping their methods a closely held trade secret, even from the departments themselves. All of this points to a trend of rapid, poorly informed adoption.

We believe that positive uses of data could improve police practices in the future. But we found little evidence that today’s systems live up to their claims, and significant reason to fear that they may actually reinforce disproportionate and discriminatory policing practices.

. . .

For more about predictive policing and civil rights, take a look at our full report, together with today’s statement.