
Civil Rights Groups Say Predictive Policing Isn't Doing Much Predicting Or Policing

A Minority Report world where police stop crime before it starts might sound ideal, but police are already using algorithms to help find criminals—and it isn't working out well.

[Illustrations: Goettingen/iStock]

In the past, police officers on neighborhood beats might have been able to rattle off some names if you asked them who was most likely to be involved in a future shooting. In Chicago today, that is a computer's job.

Since 2013, computer software has crunched crime data to spit out what is known as Chicago's "strategic subject list," now a list of 1,500 people who are most likely to be either assailants or victims in a shooting. The Chicago Police Department's goal is to have both police and social workers visit people on the list to try to prevent tragedies before they happen.
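The department has not disclosed how its model actually works, but the basic idea of ranking people by their appearances in past incident data can be sketched in a few lines of Python. Everything below is hypothetical: the field names, the scoring weights, and the data are invented for illustration, not drawn from the real system.

# Hypothetical sketch of a count-based risk ranking (not the CPD's model).
from collections import Counter

def build_subject_list(shooting_records, arrest_records, list_size=1500):
    """Rank people by a crude risk score and return the top names."""
    scores = Counter()
    for record in shooting_records:
        # Appearing in a past shooting, as victim or suspect, raises the score.
        for person in record.get("people_involved", []):
            scores[person] += 2
    for record in arrest_records:
        for person in record.get("people_involved", []):
            scores[person] += 1
    return [person for person, _ in scores.most_common(list_size)]

# Toy data: person "B" appears most often, so "B" tops the list.
shootings = [{"people_involved": ["A", "B"]}, {"people_involved": ["B"]}]
arrests = [{"people_involved": ["B", "C"]}]
print(build_subject_list(shootings, arrests, list_size=3))  # ['B', 'A', 'C']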

This seems laudable on the surface, but it also raises many questions for civil rights, civil liberties, and technology policy groups, which issued a statement and report calling out the flaws in these kinds of "predictive policing" programs. Some programs, like Chicago's, pinpoint individual people; others target hotspot neighborhoods at certain times.

"The ACLU’s chief concern with predictive policing is, simply put, garbage in, garbage out," says Ezekiel Edwards, director of the ACLU's criminal law reform project. It signed the statement along with groups including the NAACP, Color of Change, the Electronic Frontier Foundation, and 13 other groups.

A non-exhaustive map of police departments using or considering predictive software.

Crime data is known to be biased, heavily influenced by subjective enforcement and reporting patterns rather than objective crime rates. Relying on this data, the statement says, perpetuates and may even amplify the racial discrimination reflected in past crime data. There are also few regulations or guidelines for how predictive policing should be used; the groups worry the software could increase the likelihood that someone will be accused of a crime or prosecuted more aggressively. Worse, few if any of the dozen or so manufacturers of this software reveal exactly how their algorithms work.
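The "garbage in, garbage out" worry can be made concrete with a toy simulation. The numbers and the allocation rule below are invented, not any vendor's actual method: two areas have identical underlying crime rates, but one starts out more heavily patrolled, and a naive model keeps routing patrols to wherever the biased record says crime is highest.

# Toy simulation of bias perpetuation (invented numbers, illustrative only).
true_crime_rate = {"area_A": 100, "area_B": 100}   # identical ground truth
patrol_share = {"area_A": 0.7, "area_B": 0.3}      # historically biased patrols

for step in range(3):
    # Crime is recorded mostly where officers are present to observe it.
    recorded = {area: true_crime_rate[area] * patrol_share[area]
                for area in patrol_share}
    # The "prediction": send tomorrow's patrols where yesterday's record
    # shows the most crime.
    total = sum(recorded.values())
    patrol_share = {area: recorded[area] / total for area in recorded}
    print(f"step {step}: recorded={recorded}")

# area_A is always "predicted" to need more policing, even though the true
# rates are equal; the historical bias in the record never washes out.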

Early evidence also casts doubt on whether these programs are effective at all.

"In city after city, predictive policing programs have failed every test and have succeeded in violating basic civil rights," says Malkia Cyril, executive director of the Center for Media Justice.

In a RAND study of the Chicago program published in early August, for example, researchers concluded that the list was ineffective at identifying potential victims of violence or at reducing crime. It did, however, help police arrest people after the fact, making it at best a "data-driven most wanted list," as law professor Andrew Ferguson told The Verge, and at worst a vehicle for wrongfully targeting certain individuals.

The report, released by Upturn, a technology policy group, details many of these concerns and the spread of these programs. Through surveys and public reports, it found that at least 20 of the nation’s 50 largest police departments have used predictive systems and 11 are actively exploring options.

The systems are run by private companies and sold under names like "PredPol," "Hunchlab," and "Beware." Predictive policing itself is a marketing term that gives greater weight to the software’s accuracy and effectiveness than it deserves, the report says. Wade Henderson, president of the Leadership Conference on Civil and Human Rights, calls it "fortune-teller policing." The report suggests that forecasting might be a better term.

The groups call for predictive policing programs to be paused while these issues are more closely examined. They also say big data programs could be put to better use by the police—such as pinpointing officers who have a history of complaints against them or finding people who need mental health services.

"Police should be at least as eager to pilot new, data-driven approaches in the search for misconduct as they are in the search for crime, particularly given that interventions designed to reduce the chances of misconduct do not themselves pose risk to life and limb," the groups write.
