Juvenile Justice Big Data in the Era of Big Policing


WASHINGTON — Big data has already come to big city policing. The technology may be new, but some juvenile justice advocates worry that it may already be compromised by an age-old tech problem: Garbage in, garbage out.

“You definitely see the vast majority of people who are being targeted by person-based, predictive policing are young people — young people of color, young people in poor areas, in minority areas,” said Andrew Ferguson, a former public defender who teaches law at the University of the District of Columbia. “The idea of trying to identify people who are at risk and trying to do something about it — that's legitimate. The question is, is policing really the remedy you want?”

Ferguson has just published a book, “The Rise of Big Data Policing: Surveillance, Race and the Future of Law Enforcement.” In it, he traces the contours of what he sees as an emerging set of challenges for the justice system. Big city police departments, he says, hard pressed to fight crime on limited budgets, are turning to data analytics companies for help. The companies are providing reams of data on ordinary citizens, most of them young people — but instead of changing attitudes toward locking up youthful offenders, the companies seem to be hardening those attitudes, Ferguson said.

“If you have identified someone who's at risk, you have a couple of choices about what you could do. You could send a police officer to their door like they do in Chicago, you could call them in for a sort of ‘scared straight’ lecture like they do in Chicago or you could actually offer them the social services — the job opportunities, the educational opportunities — to get them out of that life and deal with the risk factors,” Ferguson said.

Ferguson’s arguments are timely — and important, said Naomi Smoot, executive director of the Coalition for Juvenile Justice, a Washington, D.C.-based nonprofit group that advocates for juvenile justice reform. (Smoot studied under Ferguson at UDC.)

“We need to make sure that the data we're putting in on the front end is the best possible data, to guarantee that we're getting the best possible outcomes for our kids,” she said. “I think when you're putting in data based on a system that has discriminatory outcomes and you're using that data to predict outcomes, we need to clean the data first.”
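To make the “garbage in, garbage out” concern concrete, here is a minimal sketch in Python of how a discriminatory labeling process can contaminate a model trained on it. Everything in it — the neighborhoods, the rates, the patrol intensities — is invented for illustration and is not drawn from any real department's data.

```python
# Hypothetical sketch: biased training labels produce biased predictions.
# Two neighborhoods have the SAME underlying offense rate, but one is
# patrolled twice as heavily, so its residents get arrested (labeled) more.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 10_000
neighborhood = rng.integers(0, 2, size=n)   # 0 or 1
offended = rng.random(n) < 0.10             # identical 10% base rate everywhere

# Arrests depend on patrol intensity, not just behavior:
# neighborhood 1 is watched twice as closely as neighborhood 0.
catch_rate = np.where(neighborhood == 1, 0.8, 0.4)
arrested = offended & (rng.random(n) < catch_rate)

# A model trained on arrest labels "learns" that neighborhood 1 is riskier,
# even though true offending is identical in both places.
model = LogisticRegression().fit(neighborhood.reshape(-1, 1), arrested)
print(model.predict_proba([[0], [1]])[:, 1])  # roughly 0.04 vs. 0.08
```

The model's output looks objective, but it is only echoing where police chose to look — which is exactly the feedback loop Smoot is warning about.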

Ferguson’s and Smoot’s worries aren’t merely speculative:

  • Chicago has used advanced analytics since 2012 to generate what its police department calls the “Strategic Subject List,” which purports to identify people at risk of violence — either as perpetrators or as victims.
  • In 2016, Pittsburgh police began using Big Data to come up with predictive “crime maps” and concentrate patrols accordingly.
  • New Orleans recently scrapped its own predictive policing contract when it emerged that its department had secretly signed a deal with a powerful defense contractor.
  • Residents of Baltimore, Maryland, are still arguing about an aborted police contract with a company that flew a surveillance plane over the city.

Advocates of predictive analytics say the technology gives police the best chance of acting proactively against crime instead of merely reacting.

Ferguson is not so sure. A D.C. consulting firm, for instance, tried to break down Chicago’s still-secret “heat list” algorithm by running a regression analysis.

“And they found that the only real, correlating factor was age,” Ferguson said. “In some ways, all the heat list was giving you was, like, young people were more likely to be involved in violence than other people. That was the reveal, which isn't a reveal.”
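For readers curious what that kind of probe looks like mechanically, here is a minimal sketch in Python of regressing opaque risk scores against candidate factors. The data is entirely synthetic — the feature names and the scoring formula are hypothetical stand-ins, since the actual Strategic Subject List inputs and weights have never been made public.

```python
# Hypothetical sketch: probing which factors correlate with an opaque
# risk score by regressing the scores on candidate features.
# All data below is synthetic and for illustration only.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 5_000

# Candidate explanatory factors (invented).
age = rng.integers(14, 60, size=n)        # years
prior_arrests = rng.poisson(1.5, size=n)
gang_flag = rng.integers(0, 2, size=n)

# Pretend the opaque scoring tool mostly keys on youth, mirroring the
# finding Ferguson describes (plus noise). Real weights are unknown.
score = 500 - 6 * age + rng.normal(0, 20, size=n)

X = np.column_stack([age, prior_arrests, gang_flag])
model = LinearRegression().fit(X, score)

for name, coef in zip(["age", "prior_arrests", "gang_flag"], model.coef_):
    print(f"{name:>14}: {coef:+.2f}")
# Age dominates; the other coefficients hover near zero — i.e., the
# "reveal" is just that younger people score higher.
```

If a regression like this finds that age alone explains the scores, the algorithm's predictive claims collapse into a demographic truism — which is the point the consulting firm's analysis made.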

Practitioners — especially lawyers for youths in the juvenile justice system — should already be pushing for discovery of any data analytics behind a youth’s arrest, Ferguson said.

“We sort of lose control of that data,” Ferguson said. “Once it becomes a quantification issue, it becomes something we check off. 'All right, he's got four red flags, three yellow flags, we're putting him in this box.'”