AI surveillance cameras to fine British "litter louts"

Originally published at: AI surveillance cameras to fine British "litter louts" | Boing Boing


So instead of paying a human to spend two minutes checking the footage for a £90 fine, they’re going to let an algorithm make legal decisions on fines and appeals: the same kind of algorithm I’m required to prove I’m not by identifying a traffic signal.

Algorithmic moderation has roundly been a disaster on social media. Algorithmic biometrics routinely return false positives and false negatives. Algorithmic chatbots are easier to radicalize than a conspiracy theorist. Algorithmic evidence is routinely successfully challenged in courts despite the legal system’s obsequious deference to dubiously credentialed “experts” prosecutors hire to market them.

The only way putting a human in the decision loop would render the system unprofitable is if it returns so many false positives as to be wrong more than it’s right.

Technologically illiterate lawmakers may not know any better, even though they should. (Would we want financially illiterate lawmakers making fiscal and monetary policy?) But the tech industry executives pushing this sort of technology are knowingly subverting justice for profit. There’s a special place in hell for these amoral scum-sucking contractors.



Ladies and gentlemen! I give you…



Dammit. I discarded that idea first and riffed on it, but it really works perfectly doesn’t it?


It really works on a couple of levels. Thank you for raising the point.

Edit to add the appropriate subtitle from the article title: “AI surveillance cameras to fine British ‘litter louts’.”


We have a security camera set up at our semi-vacant lot where we are prepping for a house build. Right now it’s mostly for my wife to make sure I don’t drop a tree on myself. However, it is pretty good at detecting cars coming onto the property and sending us a little video. We recently put hay bales across the drive to deter the fairly frequent stops people made. So yesterday a dude drove onto the driveway, just up to the hay bales, got out, unzipped, looked dead-on into the camera, then did a quarter turn, peed on the driveway, and left.

We also see a couple of bobcats who like to come visit at night, and deer.

anyway as for the topic at hand: Just pay someone to pick up the litter, or stop propping up the disposable container industry, ya fuckin’ gits.


Sounds like an automatic high pressure water cannon would be more effective. Put it on YouTube and it could be just as profitable.


Since the vast, vast majority of the footage is of people not littering, it totally makes sense to use AI to scan for evidence of the violation rather than make humans do the drudge work. The littercam link explicitly says the flagged incident is then checked by a human:

"Our platform then securely transmits this footage for human validation, before an automatic request is made to the DVLA for details of a vehicle’s registered keeper so a penalty charge notice can then be issued."


Yes, a human in the decision loop is what I was talking about. This somewhat ambiguous quote made it sound fully automated.

Our end to end solution includes all hardware, software and services – from detection device to the point where a debtor makes an appeal.

Still creepy and arguably immoral, but thanks for the correction.


Walks in


Walks out


Makes a nice change from the litter cops following smokers around in case they drop a butt, so they can film it on their body cams.
Actually, I wouldn’t have a problem with it if they were targeting fly-tippers, but that is frequently a rural crime that farmers don’t bother to report (with the result that it never becomes a priority for the police).

While it is good that the system has a human involved, if they aren’t also sending a roughly equal number of clips of people in cars *not* littering for verification (where the expected response is “no”), alongside the clips the AI thinks show littering, then change blindness will set in and the reviewers will just click “yes” on every image without really examining it.

Definitely a possibility. I have that problem with my web security program: because there are a bunch of connections I haven’t whitelisted and have to manually approve, my manual approvals have become automatic and perfunctory. I really should whitelist them, because the computer will do a more accurate check of the URLs/IP addresses than I will.

One answer could be for the AI system to forward an equal or greater number of non-littering incidents, so that the human evaluator has to actually evaluate rather than just hit the “approve” button every time.
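That decoy idea is easy to sketch. Here’s a minimal, hypothetical Python version (none of this is from the littercam product; the function and parameter names are my own invention): it mixes AI-flagged clips with known-negative decoys, then scores how often a reviewer wrongly approves a decoy, so a rubber-stamping reviewer becomes measurably wrong rather than invisibly lazy.

```python
import random

def build_review_queue(flagged, known_negatives, decoy_ratio=1.0, seed=None):
    """Interleave AI-flagged clips with known-negative decoys.

    flagged: clip IDs the AI thinks show littering.
    known_negatives: pool of clips believed to show no littering.
    decoy_ratio: decoys per flagged clip (1.0 = roughly equal numbers).
    Returns a shuffled list of (clip_id, is_decoy) pairs.
    """
    rng = random.Random(seed)
    n_decoys = int(len(flagged) * decoy_ratio)
    decoys = rng.sample(known_negatives, min(n_decoys, len(known_negatives)))
    queue = [(clip, False) for clip in flagged]
    queue += [(clip, True) for clip in decoys]
    rng.shuffle(queue)  # reviewer can't tell decoys from real flags
    return queue

def reviewer_error_rate(queue, decisions):
    """Fraction of decoys the reviewer wrongly approved as littering.

    decisions: dict of clip_id -> True if reviewer confirmed littering.
    A rate near 1.0 suggests the reviewer is clicking 'yes' on autopilot.
    """
    decoys = [clip for clip, is_decoy in queue if is_decoy]
    if not decoys:
        return 0.0
    false_approvals = sum(1 for clip in decoys if decisions.get(clip, False))
    return false_approvals / len(decoys)
```

A reviewer who approves everything scores an error rate of 1.0 on the decoys, which is exactly the signal you’d want before any fine goes out the door.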

I don’t know whether making the process 100% manual would actually solve the problem of overcharging. The AI filtering, if accurate, could actually reduce overcharging compared with 100% manual examination of the raw footage. It would take one or more good studies examining the process and outcomes to determine that. Our speculation is just that, for the moment.

Unfortunately, though, it seems a stupidly high false positive rate is a feature rather than a bug for UK police, given they insist on still using it.


Enlightened English word wranglers are the aces of alliteration.

This topic was automatically closed after 5 days. New replies are no longer allowed.