Originally published at: Faulty facial recognition lands pregnant woman in jail for carjacking | Boing Boing
…
This is the most appallingly dystopian thing I’ve read this morning. Reminds me of Terry Gilliam’s “Brazil”.
It’s not as if the fact that facial recognition technology is racist can be a surprise to anyone at this stage. If you are justifying doing racist shit on the basis that “the algorithm” told you to, or “AI” told you to, you are using technology as a fig leaf to hide your obvious racism.
So the facial recognition found a woman who looks like the carjacker, an analyst confirmed they looked alike, and the carjacking victim picked the woman from a six-pack photo lineup. Mind you, a woman he’d had sex with prior to being jacked.
I know people want to blame facial recognition technology, but it seems like the more fundamental issue is that even positive facial identification by witnesses shouldn’t be enough to charge someone with a crime.
Doppelgangers exist. I once had a woman approach me to confirm I wasn’t her son. She showed me a picture and sure enough, he looked just like me. It was eerie.
This case was especially egregious though since the victim probably wouldn’t have picked the heavily pregnant woman had he seen her in person.
THIS dangerously shoddy technology. We knew about all the other dangerously shoddy technology.
Facial recognition is fairly OK on white faces, useless on non-white faces, and should never be grounds for arrest.
TFA makes the point that if the software is let loose on a large enough dataset of faces, it is quite likely to find one that is a match. Because doppelgangers exist.
Mostly this seems to be an instance of PEBKAC. (Problem exists between keyboard and chair.) And whoever developed and sold this tool to the Wayne County police to begin with is the problem too, given this obvious emergent behavior!
Before @Otherbrother grew a beard Facebook’s facial recognition algorithm would frequently automatically tag me in the wrong pictures. I can only imagine what it would be like for a law-abiding person to have a criminal twin in a society where facial recognition from crappy security cameras was enough basis for an arrest.
There is a catch though, and that is that the facial recognition database used by the police was composed entirely of mugshots. She was in the system due to a minor offense (driving w/o a license).
Even if the facial recognition system were perfect (which it isn’t) and didn’t have accuracy issues with darker skin tones (which it does), it would still have spit out the 6 closest matches in the database. If the carjacker wasn’t in the system but looked a lot like this woman, it could still have picked her, which may very well be what happened.
A mugshot database is inherently biased because it can only pick prior arrestees, and we all know a disproportionate number of them are Black.
A larger, less biased database such as state DL/ID photos may have returned the correct match. However, police generally can’t use those databases. Yes, they can pull a photo when they run your DL, but they don’t have full access for things like facial recognition.
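To make that “6 closest matches” point concrete, here is a minimal sketch of how a nearest-neighbor face search behaves. The embedding model, gallery size, and function names are my own assumptions, not whatever vendor the Wayne County police actually bought; the key property is that a top-k search returns k candidates no matter how poor the best match is.

```python
# Minimal sketch of "return the k closest faces, no matter what" matching.
# Assumes each face has already been reduced to an embedding vector; all
# names and numbers here are hypothetical, not the vendor's actual pipeline.
import numpy as np

def top_k_matches(probe, gallery, k=6):
    """Return indices and similarities of the k gallery faces closest to the probe.

    Note: this ALWAYS returns k candidates, even if none of them are remotely
    close. There is no "no match" outcome unless you add a threshold yourself.
    """
    probe_n = probe / np.linalg.norm(probe)
    gallery_n = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    sims = gallery_n @ probe_n                 # cosine similarity to every mugshot
    order = np.argsort(sims)[::-1][:k]         # best k, regardless of how bad they are
    return list(zip(order.tolist(), sims[order].tolist()))

# Toy usage: 50,000 mugshot embeddings, one grainy probe image.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(50_000, 128))
probe = rng.normal(size=128)
print(top_k_matches(probe, gallery))           # six "candidates" either way
```

Unless someone adds and actually enforces a similarity threshold, “the system found her” just means “she was the least-dissimilar mugshot on file,” which is a very different claim.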
Huh, normally cops know facial recognition is so utterly worthless and indefensible as evidence that they hide the origin of the “identification” (which makes it even harder for the “suspect” to understand, much less challenge, the charges against them). What’s consistent is that they blindly accept the match, don’t compare the two, and miss incredibly obvious evidence that the original image and the person “matched” aren’t the same (in this case the pregnancy; in other cases major height differences, obvious tattoos, etc.). That facial recognition works worst on Black faces just feeds both the misuse and the cops’ (and prosecutors’, etc.) utter indifference toward correcting the mistake.
But facial recognition software gets trained on white (and male) faces, and has a known tendency to create false matches when used on Black faces. The two people don’t even have to look alike for it to do so. It’s really bad, and it’s known to be that bad.
bail bond companies usually charge around 10% of the bond to cover the full amount of bail ( because who has 100k on hand? ) they then keep that money as a fee regardless of the outcome of the case.
so most likely her family paid 10k that they’ll never see again. ( not to mention the cost of hiring a lawyer )
This too, for sure. It’s just that even if this particular system doesn’t have this particular flaw, it’s still not a tool for good policing, unless used in a very, very circumspect way.
I found a NYS calculator and the maximum allowable amount here would be $6260. Directionally accurate for sure!
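For anyone curious where a number like that comes from: if I have the NYS premium schedule right (10% of the first $3,000, 8% of the next $7,000, 6% of everything above $10,000; that bracket structure is my reading, so treat it as an assumption), the arithmetic on a $100,000 bond works out to exactly that figure:

```python
# Checking the calculator's number under an assumed NYS premium schedule:
# 10% of the first $3,000, 8% of the next $7,000, 6% of everything above $10,000.
def ny_bail_bond_premium(bond):
    tiers = [(3_000, 0.10), (7_000, 0.08), (float("inf"), 0.06)]
    fee, remaining = 0.0, bond
    for width, rate in tiers:
        chunk = min(remaining, width)   # portion of the bond taxed at this rate
        fee += chunk * rate
        remaining -= chunk
        if remaining <= 0:
            break
    return fee

print(ny_bail_bond_premium(100_000))    # 6260.0 on a $100,000 bond
```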
That’s long since settled science.
It’s wild to this Brit that the USA has the carefully enumerated rights, but bail can be way more than the defendant can afford and the bail bond industry exists.
Techniques like face recognition are dangerous not just because of bias in the training data, but because of the “base rate fallacy”.
Suppose for simplicity that your face recognition system is 99% accurate, i.e., innocent people are misidentified as suspects 1% of the time, and suspects are misidentified as innocent people 1% of the time. At first glance this sounds pretty good (and existing face recognition software is nowhere near this accurate). However, if there is one carjacker in a town of 10,000 otherwise honest people and you process everybody’s face, your face recognition system will return 101 matches or so, including 100 completely innocent people and possibly 1 actual criminal.
Now 1:100 is obviously better than 1:10000, but even with the wildly overestimated capability of your system which we assumed for the sake of discussion, it is still quite a distance away from the pinpoint accuracy ascribed to face recognition in movies and TV – and it means a lot of police followup work, plus a lot of potential hassle for certainly at least some of the 100 innocent people.
If you consider using your system to scan travellers’ faces for potential terrorists at the airport entrance, with an average of 80,000 passengers per day or so, the New York LaGuardia SWAT team is going to be pretty busy apprehending 800 “terrorists” every day, virtually all of whom will turn out to be innocent travellers, given that the actual rate of terrorists to normal people is a lot less than 1:10000.
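Same arithmetic as a quick sketch, plugging in the deliberately generous 99% figure assumed above:

```python
# Base rate fallacy, using the deliberately generous 99% accuracy assumed
# above (1% false positive rate, 1% false negative rate).
def expected_flags(population, offenders, fpr=0.01, tpr=0.99):
    innocents = population - offenders
    false_alarms = innocents * fpr          # innocent people flagged as matches
    true_hits = offenders * tpr             # actual offenders flagged
    flagged = false_alarms + true_hits
    precision = true_hits / flagged if flagged else 0.0
    return round(flagged), precision        # total flags, chance a flagged person is guilty

# One carjacker in a town of 10,000 otherwise honest people:
print(expected_flags(10_001, 1))    # -> (101, ~0.0098): roughly 1-in-100 odds of guilt
# ~80,000 LaGuardia passengers a day, essentially zero actual terrorists:
print(expected_flags(80_000, 0))    # -> (800, 0.0): 800 innocent people flagged daily
```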
bail is in fact more than defendants can afford because of bail bonds companies. judges know those exist, so the bail is set to get to that 10ish percent.
and yeah, it’s completely evil that it doesn’t take into account actual wealth. many people sit in jail for lack of small amounts ( 50 or 100 dollars. ) and of course those are exactly the people who lack the means to up and move to a new jurisdiction to avoid a court date
Driving with an expired license, even more trivial than that. The sort of thing that should be a desk-appearance violation and remediable with some paperwork.
What a shitshow! From a video of a suspect, the software matched on an eight-year-old picture of Woodruff, even though they had access to her current driver’s license photo.
I was hoping for an article that said which software it was, but no luck so far.