
Which Algorithm Watches the Watchmen?

Cameras connected to facial recognition software watch every corner of lower Manhattan. A "heat list" helps Chicago police pick out citizens who may be involved in a future shooting, so officers can pay them a preemptive visit. Information culled from social media, shopping habits, and phone calls is used to build a precise profile of you, and to pinpoint your involvement in any crime.

This is the present and future of policing as Andrew G. Ferguson paints it in The Rise of Big Data Policing. Big data—the combination of sufficiently fast processing, sufficiently cheap storage, and major advances in machine learning and analytics—allows computers to draw out previously unseen patterns in culture, politics, commerce, and, of course, crime.

The 2008 financial crisis and the concurrent drop in police budgets, protests across the country after the events in Ferguson, Mo., and police officers demanding new tools to cope with an increasingly stressful job all combined to drive police departments into the 21st century. Big data tools now give police unprecedented understanding of who commits, or will commit, which crimes where. These tools offer real possibilities for reducing violent crime, including, as Robert VerBruggen argued in the wake of the Las Vegas massacre, gun violence.

At the same time, Ferguson repeats over and over, these new approaches to policing carry with them substantive and often unseen risks. Algorithms can and do encode biases, giving bad assumptions the sheen of scientific objectivity. Algorithmic suspicion—investigation based on a big data claim that someone is, say, 70 percent likely to commit a crime—opens up new problems in the jurisprudence of reasonable suspicion and search.

Such issues are especially evident where race is concerned. Seemingly race-neutral predictors of criminality—location, arrest record, incarceration record, etc.—are inescapably intertwined with race. As Columbia professor and critic of predictive policing Bernard Harcourt writes, "risk is a proxy for race." These concerns are the dark downside to the predictive revolution in policing.

In spite of these risks, Ferguson is not willing to eschew big data altogether. There are good and bad instances of surveillance, he contends, and telling the difference will require caution, but we should not throw the proverbial baby out with the bathwater. Ferguson's moderation is commendable, especially at a time when people are all too prone to hyperbole.

At the same time, Ferguson's corrective recommendations seem inadequate to the scale of the revolution that big data implies, and to the depth of criticism such a revolution demands.

One of the many stories Ferguson tells about big data's use in policing is illuminating. In San Francisco, the police implemented predictive software that tabulated crime statistics and then flagged areas on a map where crime was likely to occur, marking convenient "red boxes" for officers to proactively patrol. "Apparently," Ferguson writes, "police officers, instructed to patrol the predicted area, never left." The department had to implement a "think outside the box" campaign to discourage over-policing of red box areas.
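The mechanics behind such "red box" maps are, at bottom, simple tabulation. A minimal sketch in Python of the general idea, assuming a city divided into grid cells and a list of past incident coordinates (the cell size, function name, and parameters here are illustrative assumptions, not Ferguson's description of any vendor's actual system):

```python
# A minimal sketch of "red box" hotspot flagging: divide the city into a
# grid, tabulate past incidents per cell, and flag the busiest cells.
# All names and parameters are illustrative, not any real vendor's system.
import math
from collections import Counter

CELL_SIZE = 0.005  # grid cell size in degrees (~500 m); an assumption


def flag_red_boxes(incidents, top_k=10):
    """incidents: list of (latitude, longitude) pairs for past crimes.
    Returns the top_k grid cells with the most recorded incidents."""
    counts = Counter(
        (math.floor(lat / CELL_SIZE), math.floor(lon / CELL_SIZE))
        for lat, lon in incidents
    )
    return counts.most_common(top_k)


# Officers would then be directed to patrol the flagged cells --
# and, as Ferguson recounts, they may never leave them.
```

Even in this toy form, the feedback problem is visible: patrols concentrated in flagged cells generate more recorded incidents there, which keeps those cells flagged.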

A similar effect occurred in another study, conducted by the RAND Corporation in Shreveport, La. "Researchers observed that patrol officers became more like detectives trying to figure out the crime drivers in the predicted area," Ferguson explains.

In policing, as in most human affairs, knowledge and action are strongly related. What a police officer knows, and how he knows it, determines how he chooses to act. The form of knowledge that big data policing takes is at once macro-scale—camera systems like Manhattan's Domain Awareness System mean that "the normal limitation of human observation and memory become all but irrelevant," Ferguson writes—and micro-scale, allowing Chicago police to fine-tune their targeting to just a handful of people through the "heat list." In San Francisco and Shreveport, officers worked the micro-scale in order to refine the view from the macro: a meatspace implementation of a machine learning algorithm, relying on hundreds of willing data gatherers.

What is missing from this equation—what is structurally precluded by it—is the police officer's role as an embedded feature of his community. Seeing at mass scale requires the officer to see from without, to observe the area he is responsible for policing from a bird's-eye view rather than from within it.

George Kelling and James Wilson, in their seminal and oft-misunderstood article "Broken Windows," argue that the historical role of the police officer is, more or less, that of an arm of the community, intended to keep the peace rather than solve crimes. "From the earliest days of the nation, the police function was seen primarily as that of a night watchman: to maintain order against the chief threats to order—fire, wild animals, and disreputable behavior. Solving crimes was viewed not as a police responsibility but as a private one," they write.

Over time, however, this community-enforcement role was transformed. The force was steadily professionalized, and policing became a more analytic, investigative profession, of the type we watch on shows like Law & Order today.

Optimizing for this investigative capacity alone is clearly, even causally, linked to the rise of a supporting knowledge structure that conforms to that optimization. The ideal form of that structure, unbounded by human limitations, is the emerging predictive policing regime: systems complex enough, and operating at large enough scale, to manage enormous populations with micro-specificity.

Supporters of big data in policing might protest that nothing about their approach mandates that officers become less involved with their communities. However, when the tools epistemically disconnect officers from their beats, it is hard to imagine that some amount of community policing is not being sacrificed. If officers must be reminded to "think outside the box," we should realize that they are being strongly encouraged to think within the box in the first place.

The Rise of Big Data Policing is a valuable foundation for understanding how prediction has suffused policing, in much the same way it has suffused the rest of society. Ferguson's criticisms are cogent, but more importantly, he clearly communicates a broad factual picture of the situation as it stands today.

What should give us pause about living in an increasingly big data world, however, is not merely that these tools can be misused. More important is that big data changes the way we think and the modes of action available to us. The tendency of policing is toward abstraction and professionalization, and away from the connection to community whose absence prompts so many of these concerns. Until we understand this tendency, body cameras and algorithms may do more to exacerbate than resolve the problems big data was meant to fix in the first place.