Study: bodycams increase some fatal police shootings

Originally published at: http://boingboing.net/2016/08/15/study-bodycams-increase-some.html

1 Like

In other words, “We found a way to make the numbers more interesting, so our study will get some attention before it’s entirely discredited.”

10 Likes

It’s not about shaving a few percentage points. It’s about accountability.

17 Likes

I’ve downloaded the paper to read later, but: do those stats account for having the bodycam on and working, or just having it on you but somehow not working…?

5 Likes

Those are some pretty small signals in a field with lots of noise. Reminds me of the war on traffic cams, saying they cause accidents.

6 Likes

Caused me an accident once. Lady in front of me may have been in the country illegally, I dunno. She saw a yellow light, in a city where that means speed up, and she STOOD on her brake pedal. Delta vee was so great the nose of my car went under hers and lifted it. I think she was terrified of talking to the police, but she got to anyway.

Is this an analysis of the study by you or are you just presuming dishonesty on the part of the researchers?

1 Like

And that’s the problem. In my town people barely tap their brakes at stops, and figure right of way means you can “probably” brake in time to avoid hitting them. If I have to choose between chaos and a learning curve until people actually obey the law, I’ll take the latter. Before a local cam got pulled, I got several tickets for not coming to a complete stop for a right on red. Eventually I learned to come to a complete stop and count to 5.

Anyway, the cams didn’t cause your accident, the local crappy driving culture did.

5 Likes

One righteous person among all us ne’er-do-wells sure can um… make us not do well?

Uh, no. Correlation does not equal causation:

[quote=“boingboing, post:1, topic:83457”]
“Use of wearable video cameras is associated with a 3.64% increase in shooting-deaths of civilians by the police,”
[/quote](emphasis added)

If correlation is causation then writing for Boing Boing causes inaccurate headlines.

8 Likes

Just playing the odds.

To quote Richard Smith, former editor of the BMJ:
“Most scientific studies are wrong, and they are wrong because scientists are interested in funding and careers rather than truth.”

Believe It Or Not, Most Published Research Findings Are Probably False

Studies show many studies are false (Yes, they get the irony.)

Big Science is broken

And many, many others.

1 Like

They do, IIRC, in the aggregate. The catch is that they cause rear-end accidents but reduce side-impact collisions, which are more injurious.

Anecdotally, I can confirm an increase of +1. I was rear-ended years ago when I chose to stop for a yellow that I would have proceeded through if there hadn’t been a red light camera.

4 Likes

So again, cops need better training and policies in de-escalation techniques and respect for the lives of suspects, regardless of whether they are wearing body cameras. Let’s go with that and see if it reduces fatal police shootings. I’m willing to be proven wrong with evidence.

6 Likes

Perhaps precincts that deployed bodycams did so in response to increased use of deadly force by their police compared to other precincts? I would feel less of a need for them if there wasn’t already an issue in the first place. Kind of like saying a breathalyzer installed in your car makes you more likely to try driving drunk.

5 Likes

This is pure statistical analysis and as such has no intrinsic meaning. A statistical correlation may indicate a causal relationship in either direction, or causation by a third, unaccounted-for factor, or a selection effect, or it may disappear altogether in a regression to the mean (very likely given the small size of the effect and the fact that the p-values are miserable).

Since there is no evidence it has even been submitted for peer review, not really impressed.

10 Likes

Count me among the skeptics on this article. The signal detection theory bit is a model for when the police shoot, not a description of how they did the statistical analysis. It’s a model of how you act under uncertainty, trying to avoid both the deaths of bystanders and of the suspect. They are assuming that the police are using an optimal threshold for this decision… for some reason? (Maybe because they are in a biz. school and those people always assume optimality/equilibrium. ***)

It seems like the goal of the discussion is to model and bolster their assertion that more information (in the form of crime statistics) leads to fewer shootings, as expressed below:

First, we found that in police departments that conduct statistical analyses of digitized crime data, there are 2.15% fewer fatal shootings, substantiating our theoretical prediction that criminal intelligence can prevent police officers from using lethal force.

*** Which raises the question of why biz. school people are writing about police shootings. Seems like:

7 Likes

I haven’t read the full paper, but it looks like the statistics are a big pile of crap that wouldn’t survive the first contact with peer review by anyone with any understanding of statistics.

They have 30 variables they are testing against, and they are analyzing 4 different data sets (130 trials; Tables 5 & 6). Then they went the full monty with a 9 × 12 grid (108 more trials, Table 7). Their tables flag results with asterisks at the P < 0.1, 0.05, and 0.01 levels. When you are testing 238 conditions (and doing further cherry-picking), you should just ignore everything that doesn’t at least hit P < 0.0001.
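
Just to put a number on how far the bar should move with that many comparisons, here’s a quick Bonferroni-style sanity check of my own (not anything the authors did):

```python
# With ~238 reported test conditions, holding a 5% family-wise error rate
# means the per-test significance threshold has to shrink by a factor of 238.
alpha = 0.05
n_tests = 238                      # 130 + 108 conditions, as counted above
print(f"Bonferroni-corrected threshold: {alpha / n_tests:.5f}")   # ~0.00021
```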

I couldn’t find the total number of shootings while scanning the paper, but it looks to be 968 in the bologna slice that showed the biggest effect. 27.4% of the police departments they looked at had body cams.

In that case, you expect 702 shootings in the non-bodycam precincts vs. 265 in the bodycam precincts.
Take a claimed effect size of 3.64% more in the bodycam precincts, and you are looking at about 10 extra shootings on top of that 265.

By random chance, a 1-sigma variation for N=265 is sqrt(265) ≈ 16. Thus the claimed effect size is less than a 1-sigma variation (which happens about a third of the time by random chance). With 238 trial conditions, I am surprised that they didn’t find a bigger effect somewhere.
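
Spelled out as a script, using my rough estimates above (not the paper’s actual figures):

```python
import math

total_shootings = 968        # my rough count for the slice with the biggest effect
bodycam_fraction = 0.274     # share of departments with body cams

expected_bodycam = total_shootings * bodycam_fraction    # ~265 shootings
claimed_excess = expected_bodycam * 0.0364               # ~10 "extra" shootings
poisson_sigma = math.sqrt(expected_bodycam)              # ~16, expected random scatter

print(f"expected in bodycam departments: {expected_bodycam:.0f}")
print(f"claimed 3.64% excess: {claimed_excess:.1f}")
print(f"1-sigma Poisson fluctuation: {poisson_sigma:.1f}")
```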

Add to that the fact that the effect is negative in some of their data slicing, and that dash cams have the opposite effect in many of their slices, and it’s time to stop taking the report at all seriously.

6 Likes

Body cams will of course increase documented fatal police shootings. I wonder how they document the undocumented shootings?


Listen to the podcast linked there. It’s very good.

Also:

Interestingly, sometimes it can, under the right conditions.

"For this new study statisticians focused on the simple cases of X and Y, two lone variables that are definitely linked—except we do not know whether X causes Y, or Y causes X. One way to solve that problem would be to run an expensive study that controls all outside variables and hones in on X and Y.

Or we could tap into additive noise model testing.

Additive noise model testing is based on the simple assumption that there is always some statistical noise clinging to the key variables in any experiment—areas where the data becomes fuzzy and unreliable due to measurement errors. Regardless of any link, each variable will have its own unique noise signature, with one caveat: If X causes Y, then the noise in X will be able to contaminate Y, but the noise in Y will not be able to do the same to X. Because a cause can affect an effect, but an effect cannot affect a cause (read that last line a few times)."

http://www.vocativ.com/335705/correlation-causation/

Full paper (which I have only started to read) here: http://www.jmlr.org/papers/volume17/14-518/14-518.pdf
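
If you want to play with the idea, here’s a minimal toy sketch of additive-noise testing. It’s my own simplified version with made-up helper names like `anm_direction_score`, not the method from the linked paper: fit a regression in each direction and ask which direction leaves residual noise that still looks independent of the putative cause.

```python
import numpy as np

def hsic(x, y, sigma=1.0):
    """Biased HSIC estimate with Gaussian kernels: a simple dependence score."""
    n = len(x)
    def gram(v):
        d = (v[:, None] - v[None, :]) ** 2
        return np.exp(-d / (2 * sigma ** 2))
    K, L = gram(x), gram(y)
    H = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

def standardize(v):
    return (v - v.mean()) / v.std()

def anm_direction_score(cause, effect, deg=3):
    """Fit effect = f(cause) + noise with a polynomial; score how dependent
    the leftover noise still is on the putative cause (lower = more plausible)."""
    coeffs = np.polyfit(cause, effect, deg)
    residuals = effect - np.polyval(coeffs, cause)
    return hsic(standardize(cause), standardize(residuals))

rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, 300)
y = x + 0.5 * x**3 + rng.uniform(-0.5, 0.5, 300)   # ground truth: x causes y

# The direction whose residuals look independent of the regressor should win.
print("x -> y score:", anm_direction_score(x, y))   # typically the smaller score
print("y -> x score:", anm_direction_score(y, x))
```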