Requiring police to wear body cameras is still controversial in some districts, but the frequency of such requirements has exploded in recent years, thanks in part to high-profile cases of misconduct recently caught on tape. In many cases, these cameras are being rolled out before policy for their use has been finalized. As in many other areas, technological capability has already taken a major step past policy.

With the help of specialized data mining solutions, some jurisdictions are eyeing far more insightful measures — crucially, ones that can predict and prevent misconduct rather than simply document it. FiveThirtyEight has a great rundown of the tortured, largely political history of these algorithms, and the battles they’ve started between governments and police unions. In particular, they have run up against basic ideas of criminal, and even just personal, culpability: the principle that people answer for what they have done, not for what they might do.

Crime mapping is just one, relatively non-invasive, method of applying analytics to the population at large.

The problem with this is that it leads to officers being treated differently based on actions they have yet to take, and might never take at all. This is, in essence, the problem foreshadowed by Minority Report.

Unlike the fictional, psychic pre-cogs, however, this technology begins so simply it might seem almost useless: the first, and still one of the strongest, indicators of future misconduct is… past misconduct. Deciding to divert special attention toward officers with a history of complaints is not exactly rocket science, but it was still quite controversial when first introduced. Modern versions of these solutions are proposed to be far more invasive, and potentially far more useful. They can take into account the nature of each officer’s recent calls, flagging, for instance, someone who has responded to a string of stressful situations, like domestic abuse calls, all in a row. They can also loop in overall lifestyle attributes.
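To make the mechanics concrete, here is a minimal, purely illustrative sketch of that kind of early-warning check in Python. Every detail below, including the field names, the set of "high-stress" call types, and the thresholds, is an assumption made for demonstration, not a description of any deployed system.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

# Call types treated as high-stress; purely an assumption for this sketch.
HIGH_STRESS_CALLS = {"domestic abuse", "suicide attempt", "child endangerment"}

@dataclass
class Officer:
    name: str
    complaint_dates: list[date] = field(default_factory=list)  # past misconduct complaints
    recent_calls: list[str] = field(default_factory=list)      # recent calls, newest last

def needs_review(officer: Officer, today: date,
                 complaint_window_days: int = 365,
                 complaint_threshold: int = 2,
                 stress_streak_threshold: int = 3) -> bool:
    """Flag an officer for supervisory review based on two simple signals:
    recent complaint history and an unbroken run of high-stress calls."""
    window_start = today - timedelta(days=complaint_window_days)
    recent_complaints = sum(d >= window_start for d in officer.complaint_dates)

    # Count how many of the most recent calls, in a row, were high-stress.
    streak = 0
    for call in reversed(officer.recent_calls):
        if call not in HIGH_STRESS_CALLS:
            break
        streak += 1

    return recent_complaints >= complaint_threshold or streak >= stress_streak_threshold

# Example: only one complaint this year, but four domestic abuse calls back to back.
officer = Officer("A. Example",
                  complaint_dates=[date(2016, 3, 1)],
                  recent_calls=["traffic stop"] + ["domestic abuse"] * 4)
print(needs_review(officer, today=date(2016, 6, 1)))  # True, because of the call streak
```

The point is how little sophistication the core signal requires: a complaint count and a streak counter are already enough to put someone on a supervisor's radar.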

Should an algorithm aimed at preventing violent misconduct be allowed to take into account known information about an officer’s financial or marital situation? Statistically, these are very useful predictors of behavior, implying heightened levels of stress and lowered impulse control. Police organizations see the collection of such information as invasive, and any actions taken on the basis of these predictions as preemptive punishment. The psychologists and engineers who designed the system would frame their interventions — like mandatory counseling — as services provided to the officer, but of course the officers themselves are unlikely to see it that way.
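Once lifestyle attributes are folded in, the same idea starts to look like a weighted score rather than a hard rule. The features, weights, and cutoff in the sketch below are invented purely for illustration; the point is only to show how financial or marital stress indicators would raise an officer's score and route them toward a counseling referral rather than a sanction.

```python
# Hypothetical weights for a linear risk score; none of these values come from a real system.
FEATURE_WEIGHTS = {
    "complaints_last_year": 0.40,
    "high_stress_call_streak": 0.25,
    "financial_stress": 0.20,  # e.g. a wage garnishment or bankruptcy flag (0 or 1)
    "marital_stress": 0.15,    # e.g. an ongoing divorce proceedings flag (0 or 1)
}

COUNSELING_CUTOFF = 1.0  # arbitrary threshold chosen for this sketch

def risk_score(features: dict[str, float]) -> float:
    """Weighted sum of risk indicators for one officer; missing features count as zero."""
    return sum(FEATURE_WEIGHTS[name] * features.get(name, 0.0) for name in FEATURE_WEIGHTS)

def recommend_intervention(features: dict[str, float]) -> str:
    """Frame the output as a service (a counseling referral), not a punishment."""
    score = risk_score(features)
    if score >= COUNSELING_CUTOFF:
        return f"refer to counseling (score={score:.2f})"
    return f"no action (score={score:.2f})"

# Example: two complaints, three stressful calls in a row, and known financial strain.
print(recommend_intervention({
    "complaints_last_year": 2,
    "high_stress_call_streak": 3,
    "financial_stress": 1,
}))  # refer to counseling (score=1.75)
```

Whether such a score is a legitimate wellness tool or a preemptive punishment is exactly the dispute between the unions and the system's designers.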

A simple model for a neural network.

The biggest problem with the defensive argument coming from police organizations is hypocrisy. Many police organizations have been willing to undo privacy protections or take preemptive action against suspects based on the insight they glean from ill-gotten datasets. Predictive analytics are a huge part of modern policing, and are becoming bigger by the day — even in pursuit of crimes far less serious than police misconduct.

If stationing cops on a statistically dangerous corner, which by definition increases the arrest rate in that area, isn’t pre-judging those residents for crimes not yet committed, then requiring an officer to talk about emotions on the basis of very real dangers to the public simply cannot be, either.

Cops, after all, signed up to be civil servants, accepting along with their guns and extra liberties some extra oversight as well. Holding them to a higher standard and gathering additional information on their actions can protect both police officers and the communities they serve.

http://www.extremetech.com/extreme/224560-new-analytics-can-predict-and-possibly-prevent-police-misconduct