Originally published at: https://boingboing.net/2019/07/06/flickering-car-ghosts.html
…
I knew I should have taken that left turn at Alba'); DROP TABLE Destinations;--
a 100ms instant – too short for human perception
Bull. I can perceive a 100ms event.
Is the intrusive dance music a mechanism for ensuring the total distraction of the human driver? Did the robots compose this music?
Signs with equal spacing between them: Antelope Frwy, 1 mi. … Antelope Frwy, 1/2 mi. … Antelope Frwy, 1/4 mi. … Antelope Frwy, 1/8 mi. … Antelope Frwy, 1/16 mi. … Antelope Frwy, 1/32 mi. … etc. “Ralph, Ralph! I’ll take it!!!”
Depends on the event.
A nuclear reaction gone critical? Sure.
A flicker on a road sign when you were looking at something else? Not so much.
Yeah, I am pretty sure I have struck a problem like this with my own eyes. I was crossing a road, and as I swept my head from left to right I caught a snatch of a traffic signal in my peripheral vision and it looked green. I looked again and it was red. My interpretation is that the LED signals do a brief self-test of the LEDs once a second or so, just to make sure they are still working. It’s too brief to see normally, but if the light moves into your field of vision at just the right moment you might catch the test cycle.
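If that’s what is happening, the odds of catching it aren’t even that small. A back-of-the-envelope sketch of the arithmetic (the 100 ms pulse, 1 s period, and 50 ms glance are all my guesses, not measured values):

```python
# Rough odds that a brief glance overlaps a periodic LED self-test pulse.
# All three numbers are assumptions for illustration, not measurements.
PULSE_MS = 100    # guessed length of the self-test flash
PERIOD_MS = 1000  # guessed repeat interval ("once a second or so")
GLANCE_MS = 50    # guessed time the signal sits in your sweeping gaze

# Within each period, the glance overlaps the pulse exactly when its start
# falls inside a window of length (pulse + glance), so:
p_catch = min(1.0, (PULSE_MS + GLANCE_MS) / PERIOD_MS)
print(f"chance of catching the test pulse on one glance: {p_catch:.0%}")  # ~15%
```

Roughly one sweep in seven would land on the test cycle, which fits with seeing it occasionally but not every time.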
Diversionary/disruptive road signs… but hi-tech.
This has the makings of a Road Runner cartoon reboot.
There are lots of places where the signs are simply wrong, too. In my downtown area, there’s a 25 mph speed limit, but there’s also a caution sign recommending a speed of 30 mph for a curve in that 25 mph zone. And there’s one by my work where the sign says 40 mph, but the speed painted on the roadway immediately adjacent to the sign says 45 mph.
Do not ascribe to malice what can be explained by incompetence.
from unexpected swerves to sudden speed-changes to detours into unsafe territory
When a (real or fake) road sign can do any of that to a car-driving AI, that AI still needs a lot of work before being released into the wild.
There’s still a bit of churn before self-driving cars are ready to be released. It always seems to be just a little further out than we were promised.
I’ve had similar things happen, but I think it is more of a function of how your eyes work.
https://www.cis.rit.edu/people/faculty/montag/vandplite/pages/chap_9/ch9p1.html
During a high-speed movement of the eye or head, I think the data you’re “seeing” is low quality and possibly not in color. Of course your brain fills in everything it thinks it can, and based on spatial positioning you know which color the light should be if the top or bottom is lit. I have something akin to nystagmus and could easily see a 100ms event if I were looking for it (LED tail lights bug the ish out of me).

But the issue here is the AI not recognizing that the “illuminated sign” is just being projected onto a building. Where I live, illuminated signs aren’t a thing except maybe in a work zone. More importantly, all this information is secondary to the actual driving of the car. Speed limit signs are vaguely important, a one-way sign only slightly more so; I or an AI should be able to understand from the environment what is going on and whether safe conditions are being met. 90 kph in the city? Yeah, that’s a hard no.
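That last sanity check is easy to write down, even if getting the context right is the hard part. A toy sketch of what “reject limits that don’t fit the environment” could look like (the context labels and caps here are invented for illustration, not taken from any real driving stack):

```python
# Toy plausibility filter: distrust a sign-derived speed limit that doesn't
# fit the driving context. All thresholds are invented for illustration.
PLAUSIBLE_MAX_KPH = {
    "city_street": 60,
    "rural_road": 100,
    "motorway": 130,
}

def accept_speed_limit(sign_kph: int, context: str) -> bool:
    """Accept the sign only if its value is believable for this context."""
    cap = PLAUSIBLE_MAX_KPH.get(context)
    return cap is not None and 0 < sign_kph <= cap

print(accept_speed_limit(90, "city_street"))  # False -- "90 kph in the city" is a hard no
print(accept_speed_limit(50, "city_street"))  # True
```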
Other commenters have noted that a sufficiently well-designed AI should drive safely regardless of what the signs tell it. However, the attacker might have other intentions. Consider:
- corrupt cops flash up bogus traffic restrictions, then ticket you for following them
- 0.1%ers flash no parking signs to save their favored spot, or diversions to clear traffic from their route
- crims redirect you to a suitable spot for carjacking/extortion/stealing cargo/etc
- delivery firms sabotage their competitors by delaying deliveries
- signs as an injection vector for some further exploit
- something I didn’t think of in these few minutes
There are more possibilities here than inducing cars to crash.
Reminds me of the fulgurator[1]: a gutted SLR camera fitted with a flash set to be triggered by other flashes going off. The image on the slide inside the fulgurator is then projected onto whatever it’s focused on, and that image ends up in the photo taken by the unsuspecting victim.
[1] http://juliusvonbismarck.com/bank/index.php/projects/image-fulgurator/2/
Auto-vehicles are only the start. When we’re all linked by neural implants to a central AI, where will we end up, even without malefactors with tachistoscopes flashing us?
Oh good grief. It was not an “autonomous vehicle” despite what the Boing Boing headline says. It was not even an autopilot. It was a “driver assistance system” – it flashes info on a heads-up display for the driver to read.
Yes, the system was merely a helper. That’s a start. But when will we be “…free of our labors
and joined back to nature,
returned to our mammal
brothers and sisters,
and all watched over
by machines of loving grace” ??