That tracks.
Many years ago I was made aware of the “false positive” problem [1] that applies to things like facial recognition software: if the people you’re looking for make up a smaller fraction of the crowd than the software’s false positive rate, then you’ve got a less-than-even chance of having the right person when the software says “identified”.
Example: in a city where the cameras scan 50,000 people a day, you’re scanning for 100 particular suspects, whom you have sound reason to believe are in the crowd somewhere.
Facial recognition software often brags about “99.5% accuracy” [2]. Let’s take that at its word (there’s no reason to, but for this example we will).
50,000 people scanned × 0.5% false positive rate = 250 innocent people flagged.
100 suspects. Let’s round 99.5% up, so all of them are detected.
Given that a detection has occurred, the chance you’ve got the right person is 100 / (100 + 250) ≈ 29%.
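If you want to poke at the arithmetic yourself, here’s a quick sketch. The function name is mine, and it assumes a single error rate acting as both the false positive rate and the false negative rate, which is a simplification; real systems quote those separately, when they quote them at all.

```python
def ppv(crowd: int, suspects: int, error_rate: float) -> float:
    """Chance a flagged person is actually a suspect (positive predictive value).

    Assumes error_rate is both the false positive rate (innocents flagged)
    and the false negative rate (suspects missed); a simplification.
    """
    innocents = crowd - suspects
    true_positives = suspects * (1 - error_rate)   # suspects correctly flagged
    false_positives = innocents * error_rate       # innocents wrongly flagged
    return true_positives / (true_positives + false_positives)

print(f"{ppv(50_000, 100, 0.005):.0%}")  # ~29%, the number above
print(f"{ppv(50_000, 100, 0.15):.0%}")   # ~1%, at the sub-85% accuracy from [2]
```

Note the lever here: the 250 false positives swamp the 100 real hits as soon as the false positive rate (0.5%) outruns the suspects’ share of the crowd (100 / 50,000 = 0.2%).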
99.5% might be too good to be true, btw. And 50,000 is a small number for this kind of work. So it’s probably worse than that.
The points @PsiPhiGrrrl has raised illustrate this: whether LEOs say “let’s be careful not to persecute innocent people” or “round 'em up and let the courts sort it out” depends on whether the suspect’s skin is lighter or darker than a hamburger bun.
ETA:
[1] The “false positive problem” was first explained to me by my probability prof, whose distaste for authoritarians was personal: in 1940, when he was a baby, his family fled from France to Romania. As a “numbers professional”, he absolutely hated people quoting figures like “99.5% accurate” without mentioning what that means in practice when you’re looking for something rare. And as someone whose family had fled a persecution nightmare, he hated the agendas behind that number-juggling.
[2] The quoted accuracy rates vary from 99.95% under “ideal conditions” (identical lighting and background to the reference image) to less than 85% once you complicate the problem with a different background, uneven lighting, faces partially obscured by umbrellas, hats, other faces, and so on.