Facial recognition leads to wrongful arrest of Black man in Detroit

Originally published at: https://boingboing.net/2020/06/24/facial-recognition-leads-to-wr.html

7 Likes

Wow, who could possibly have foreseen this? Especially in the face of multiple studies showing this kind of software is prone to increased error when dealing with PoC? And the company name “Rank One?” That’s just a little too on the nose! (Ahhh, a pun I did not intend!)

27 Likes

A racist society creates racist software.

19 Likes

No surprise. Human witnesses make the same kind of mistake all the time. After all, they all look alike.

4 Likes

Maybe they should have double-checked with a polygraph test? /s

8 Likes

Sadly, this mis-identification is still a better track record than actual human eyewitness identification, which, IIRC, is what leads to most wrongful convictions.

4 Likes

Facial recognition hasn’t been around long enough to assess the false conviction rate.

11 Likes

what-could-possibly-go-wrong

3 Likes

I remember when A.I. predictive-policing software was coming out, it was criticized as possibly being racist.

And cops were saying, “Oh right, computers are racist now, too.”

Fucking fucks, yes they can be.

6 Likes

In the sense that we know it doesn’t even remotely work for Black people, yeah. I’m shocked anyone thought it was a good idea, unless the whole point was the enormous false positive rate for matches. (Because it gives them an excuse to arrest Black people at will. They matched!)
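The false-positive arithmetic is worth spelling out. As a rough sketch with made-up illustrative rates (not Rank One’s actual numbers), even a matcher that is “99% accurate” per comparison flags hundreds of innocent people when a probe photo is searched against a large mugshot database:

```python
# Illustrative base-rate calculation. The numbers below are assumptions
# for the sake of the example, not any vendor's measured performance.
database_size = 50_000          # faces searched against one probe photo
false_positive_rate = 0.01      # 1% chance a non-matching face gets flagged
true_matches_in_db = 1          # at most one actual match exists

expected_false_matches = (database_size - true_matches_in_db) * false_positive_rate
print(f"Expected false matches per search: {expected_false_matches:.0f}")
# With these assumptions, one search flags roughly 500 innocent people,
# so any single "hit" is overwhelmingly likely to be wrong without corroboration.
```

Which is exactly why a “match” on its own should never be probable cause.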

6 Likes

Donating to ACLU now. This is a very important case.

3 Likes

Here’s a notion: use the face reco as an unreliable eyewitness. Cops could call the guy up (or stop by) and say, “Hey, someone says you were at this place at this time and a crime happened; we’re following up and want to know more.” Then he could say something like “I was at work, and have many eyewitnesses,” or “F off, charge me or leave me alone.” At that point they would have to do some actual crime-solving instead of just grabbing people off the streets.

The face-reco should be only one part of a larger stack of evidence, but there’s not much interest in that sort of crime-solving most days.

3 Likes

Although the facial recognition racial bias problems are different ones. These systems are really bad at recognizing the gender of Black women, at recognizing that photos of Black people even show people, and at recognizing that different Black people are different people (they tend to match faces to whatever photos are in the database)… i.e. pretty much the problems that a racist police system already has with Black people. If the facial recognition systems mis-identified Black faces as white, that would actually be a big improvement over the reality.

2 Likes

On two occasions I have been asked, “Pray, Mr. Babbage, if you put into the machine racist figures, will unbiased answers come out?” I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.

1 Like

Like the cops need more excuses to harass Black people. No thanks.

2 Likes

Does anyone here have knowledge of how the cops’ facial recognition systems are set up? That is… is there a non-erasable record kept of the device/system’s “hits”? Otherwise, cops who racially profile can get away with blaming the device/software for errors and shifting the blame onto the device makers.

1 Like

Even if they got the thing working, imagine what a nightmare it would be to be a person of color who happened to have an identical twin sibling who had a warrant out for their arrest. You wouldn’t be able to leave the damn house.

Texas has a pair of brothers who trade their IDs all the time, in order to be released on a technicality.

Not long until someone gets SWATted as the result of a bad match. At which point… nothing will change (judging from past atrocities).

1 Like