The Push to ‘Predict’ Police Shootings

Tracking officers’ stress exposure and body-camera practices could help keep them from pulling the trigger.

The push to use body cameras on police now has a surprising source: the camera industry itself. Late last month, Axon, the No. 1 manufacturer of body cameras in the United States, announced its Performance tool, which is seemingly aimed at the long line of high-profile body-camera failures. The tool, a paid upgrade for current customers, is a dashboard that quantifies how often officers turn their cameras on during calls and whether they categorize videos correctly.
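Axon has not published how Performance computes its numbers, but conceptually the dashboard reduces to per-officer compliance rates. A minimal sketch, with invented log fields and made-up data rather than Axon's actual schema, might look like this:

```python
import pandas as pd

# Hypothetical dispatch and video logs; the field names and values are
# invented for illustration, not taken from Axon's product.
calls = pd.DataFrame({
    "officer_id": ["A1", "A1", "A2", "A2", "A2"],
    "camera_activated": [True, False, True, True, False],
})
videos = pd.DataFrame({
    "officer_id": ["A1", "A2", "A2"],
    "category_correct": [True, True, False],
})

# Activation rate: share of an officer's dispatched calls with the camera on.
activation = calls.groupby("officer_id")["camera_activated"].mean()

# Categorization accuracy: share of an officer's uploads labeled correctly.
categorization = videos.groupby("officer_id")["category_correct"].mean()

report = pd.DataFrame({
    "activation_rate": activation,
    "categorization_accuracy": categorization,
})
print(report)
```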

Axon’s announcement came just a day after a jury convicted the Minneapolis police officer Mohamed Noor of murder and manslaughter in the shooting death of an unarmed civilian, the Australian-born yoga instructor Justine Damond. The case is among the most high-profile incidents of police violence involving a body-camera failure.

While both Noor and his partner wore body cameras the night Damond was shot, neither camera was turned on to record the shooting. Shannon Barnette, the sergeant on duty and Noor’s supervisor, was the first to arrive on the scene after Damond’s death. Footage from her body camera is split into three parts. In the first, she drives to the scene. In the second, she approaches another officer; says “I’m on,” presumably referring to her body camera; and then turns the camera off. The footage resumes two minutes later. Prosecutors asked Barnette why her camera was turned off, then back on.

“No idea,” Barnette responded.

Barnette testified that the department’s policy on when the cameras are supposed to be on was “not clear” at the time. Since the shooting, the Minneapolis Police Department has revised its policy: The cameras stay on.

Andrew Ferguson, a professor at the University of the District of Columbia David A. Clarke School of Law, studies what he calls “blue data,” information collected from police-officer activities that can then be used for police reform. Specifically, he’s interested in police “resistance” to being surveilled, drawing a direct comparison between the predictive analytics used on police and those used on citizens.

“Police officers are the first ones to say, ‘Hey, that’s unfair that I’m not gonna get this promotion, because some algorithm said I might be more violent or at risk than someone else,’” Ferguson says. “And you want to turn around and say, ‘Exactly. It’s unfair that some kid gets put on a heat list because he lives in a poor area and he’s surrounded by poverty and violence.’”

Lauren Haynes, the former associate director of the Center for Data Science and Public Policy at the University of Chicago, helped design a statistical model to predict when officers may become involved in an “adverse event,” anything from a routine complaint up to an officer-involved shooting. The project didn’t use the kind of body-camera data that Axon’s new tool works with, but she says such data are “absolutely something that could be put into the model.”
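The Chicago team’s actual features and model aren’t detailed here, but the basic shape of such an early-intervention system is a supervised classifier trained on officer-level features and past adverse events. A hypothetical sketch, using entirely synthetic data and an invented body-camera-compliance feature to illustrate Haynes’s point:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500

# Synthetic, invented officer-level features -- not the Chicago project's
# actual inputs. Body-camera activation rate is included only to show how
# compliance data could be folded in as one more feature.
X = np.column_stack([
    rng.poisson(3, n),          # prior complaints
    rng.uniform(0, 1, n),       # share of high-stress dispatches
    rng.uniform(0.5, 1.0, n),   # body-camera activation rate
])
# Synthetic labels: whether the officer had an "adverse event" later on.
y = (rng.uniform(0, 1, n) < 0.1 + 0.05 * X[:, 0] * X[:, 1]).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# Rank held-out officers by predicted risk so a department could prioritize
# outreach, training, or support for those at the top of the list.
risk = model.predict_proba(X_test)[:, 1]
print("highest-risk indices:", np.argsort(risk)[::-1][:5])
```

The output of a system like this is a ranking, not a verdict; how departments act on it is exactly the fairness question Ferguson raises about predictive analytics more broadly.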
