Welsh police deployed facial recognition tech with a 92% false positive rate, but they're sure it's fine

Originally published at: https://boingboing.net/2018/05/08/cachu-hwch.html

2 Likes

no members of the public have complained.

…for long.

2 Likes

To be honest, sifting through footage of a crowd at a football match (looking for known hooligans), with further manual screening of the images before anything is actually done, is a sensible use for this sort of tech as long as the false negative rate is low enough. It just means that they only have to look at a few hundred photos, not tens of thousands.

1 Like

Whether this is as ridiculous as implied kinda depends on how they were using it. If they need to look at 200,000 people, and the software whittles it down to 2,000, and 200 of those are people they were actually looking for, that’s a 90% false positive rate, but the software is doing a useful job. In that scenario, the relevant figure is the rate of false negatives.
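
To put rough numbers on that scenario, here's a minimal Python sketch (the 200,000 / 2,000 / 200 figures are the hypothetical ones above, not anything measured from the actual deployment):

```python
# Hypothetical numbers from the scenario above, not real event data.
scanned = 200_000   # faces in the crowd
flagged = 2_000     # alerts raised by the software
true_hits = 200     # flagged people who really were being looked for

false_alarms = flagged - true_hits   # 1,800 wrong alerts
innocents = scanned - true_hits      # assumes all 200 wanted people were in the crowd

share_of_alerts_wrong = false_alarms / flagged      # the "90%" figure
chance_innocent_flagged = false_alarms / innocents  # per-person false positive rate

print(f"{share_of_alerts_wrong:.0%} of alerts are wrong")          # 90%
print(f"{chance_innocent_flagged:.1%} of innocent faces flagged")  # ~0.9%
```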

In any case, a system with a 92% false positive rate is not particularly worrying because it can’t be used without human review. It’s software that’s right 99.9% of the time you need to worry about, because that’s the kind of software where they’ll let it convict you all on its own, which is bad news for the 0.1% of people it’s wrong about.

7 Likes

tech with a 92% false positive rate

Good enough for Government work.

7 Likes

These guys are the Fox News of law enforcement facial recognition. 10% right, 90% garbage.

1 Like

The false positive rate here is not 92%

If you had a cancer detector such that, if you tested 100 people who did not have cancer, it said “NO CANCER” to 99 of them and “CANCER” in error to 1 of them, you would call that a false positive rate of 1%.

And if, when you tested 100 people who did have cancer, it said “CANCER” to 99 of them and “NO CANCER” to 1 of them, you would call that a false negative rate of 1%.

If you then tested a population of 10000 people, 9900 of whom did not have cancer and 100 of whom did, you would expect there to be 99 false positives (1% of 9900) and 99 real positives (99% of 100).

What you have done is the equivalent of taking these stats - the output of a system with a 1% false positive rate and 1% false negative rate - and saying “Oh, there are 99 false positives and 99 real positives, so the false positive rate is 50%”

This is a statistical error so stupid and so common it has its own special name: the base rate fallacy.
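
For anyone who wants to check those figures, here is the same arithmetic as a short Python sketch (the population and rates are the made-up ones from the example above):

```python
# Made-up cancer-detector example from above: 1% false positive rate,
# 1% false negative rate, in a population where only 1% actually have cancer.
population = 10_000
have_cancer = 100
healthy = population - have_cancer   # 9,900

false_positive_rate = 0.01   # healthy people wrongly told "CANCER"
false_negative_rate = 0.01   # sick people wrongly told "NO CANCER"

false_positives = healthy * false_positive_rate           # 99
true_positives = have_cancer * (1 - false_negative_rate)  # 99

wrong_share = false_positives / (false_positives + true_positives)
print(f"{wrong_share:.0%} of positive results are false")  # 50%, even though
# the detector's false positive rate is only 1%
```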

14 Likes

Either the fraction of criminals in a Welsh crowd is incredibly high, or that 92% figure is nearly too good to be believed.

Assume that one person in a crowd of 10000 is a dangerous criminal and that the software is 99% accurate. The most likely outcome is that the one criminal will be identified, along with 99 innocent people who perhaps look a bit like him. The 99% accuracy has turned into a 99% false positive rate because the base rate is so low.

The only way I can see face recognition at today’s level of technology getting the false positive rate down into this range is if the base rate is much, much higher - as if they’re screening for “this person at some point in the last few years was stopped for a traffic violation,” and then claiming success if the person that the software flags has a traffic violation on the record, even if it’s the wrong individual.

I suppose the base rate could be that high if Wales practises mass criminalisation at near-US levels. Otherwise, I’m really sceptical about that 92% figure. It’s implausibly good.
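
To illustrate how strongly that “share of alerts that are faulty” figure depends on the base rate, here’s a small sketch assuming a hypothetical matcher that is “99% accurate” both ways (nothing measured from the South Wales system):

```python
# Illustrative only: a matcher with a 1% false positive rate and a
# 1% false negative rate, run over crowds with different base rates.
fp_rate = 0.01
fn_rate = 0.01

for base_rate in (1 / 10_000, 1 / 1_000, 1 / 100, 1 / 10):
    false_pos = (1 - base_rate) * fp_rate  # innocent people wrongly flagged
    true_pos = base_rate * (1 - fn_rate)   # wanted people correctly flagged
    faulty = false_pos / (false_pos + true_pos)
    print(f"base rate 1 in {round(1 / base_rate):>6}: {faulty:.0%} of alerts are faulty")
# roughly: 1 in 10000 -> 99%, 1 in 1000 -> 91%, 1 in 100 -> 50%, 1 in 10 -> 8%
```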

2 Likes


Yeah, this system seems to be working as a kind of pre-filter for the human. We should be interested in the output of the whole system, not of individual components.

I think that this is another case where, once you go to a machine, you get numbers where none existed before. If it had been a single human, what’s the corresponding thing? A glance and a thought of “who is that? oh, never mind”? No one records things like that, but it’s going to have a high false positive rate.

Were they searching for specific people or just anyone with a warrant? The question is, if you collected 2000 photos at random, how many of those people would turn out to have a warrant?

This was my thought exactly. The false positive rate needs to stay high enough that the human beings doing the screening feel like they are doing the work and not rubber-stamping the machine. I worry this would kick in at accuracy rates way worse than 99.9%.

Honestly I originally misread the headline and thought it said that the software was only 92% accurate (8% false positive) and that seemed very worrying to me.

1 Like

In fairness, that’s a maximally difficult data set to draw distinct faces from…

4 Likes

Humans are ridiculously bad at reasoning about conditional probabilities, even when their jobs depend on it.

1 Like

"Taffy was a Welshman,
Taffy was a thief…"

Round 'em all up!

2 Likes

This is an inbreeding joke, huh? Low-hanging fruit, dude! And three assholes liked it, go figure. You’re making Ein angry, don’t make Ein angry.

1 Like

In my defense, I’m half Welsh, so I can make/like jokes if my mother doesn’t hear about it.

edit: Which half is the asshole is left as an exercise for the reader. :wink:

4 Likes

they discovered that the system generated 2,470 alerts, 2,297 of which were faulty. …

The South Wales Police say they arrested “over 450” people at the event thanks to facial recognition technology, but none were the result of false positives.

How can both of these statements be true? If there were only 173 alerts that were not false positives, how can there be over 450 arrests made based on true-positive alerts?

1 Like

According to the article, the 450 figure was over the last nine months, not for this event. Cory misinterpreted.

2 Likes

Da iawn (“very good”). I suppose you get a pass and won’t make Ein angry.

2 Likes