Stet, a gorgeous, intricate, tiny story of sociopathic automotive vehicles

Originally published at: https://boingboing.net/2018/10/17/autonomous-vehicular-sociopath.html

4 Likes

Art follows life. But it’s the sociopathic people who should be held responsible.

"One day in 2011, a Google executive named Isaac Taylor learned that, while he was on paternity leave, Levandowski had modified the cars’ software so that he could take them on otherwise forbidden routes. A Google executive recalls witnessing Taylor and Levandowski shouting at each other. Levandowski told Taylor that the only way to show him why his approach was necessary was to take a ride together. The men, both still furious, jumped into a self-driving Prius and headed off.

The car went onto a freeway, where it travelled past an on-ramp. According to people with knowledge of events that day, the Prius accidentally boxed in another vehicle, a Camry. A human driver could easily have handled the situation by slowing down and letting the Camry merge into traffic, but Google’s software wasn’t prepared for this scenario. The cars continued speeding down the freeway side by side. The Camry’s driver jerked his car onto the right shoulder. Then, apparently trying to avoid a guardrail, he veered to the left; the Camry pinwheeled across the freeway and into the median. Levandowski, who was acting as the safety driver, swerved hard to avoid colliding with the Camry, causing Taylor to injure his spine so severely that he eventually required multiple surgeries.

The Prius regained control and turned a corner on the freeway, leaving the Camry behind. Levandowski and Taylor didn’t know how badly damaged the Camry was. They didn’t go back to check on the other driver or to see if anyone else had been hurt. Neither they nor other Google executives made inquiries with the authorities. The police were not informed that a self-driving algorithm had contributed to the accident."

8 Likes

#teampecker

Thanks for the post, nice little story.

1 Like

I now have all the sads. That was absolutely brilliant storytelling.

4 Likes

This story happened to show up in my feed right before this BB post and I confused the two.

https://www.tor.com/2018/10/17/ai-and-the-trolley-problem-pat-cadigan/

I thought you were just writing a prequel until I read the article, holy fucksticks!

1 Like

I particularly appreciate that the author made the car a Toyota. Toyota cars have already killed people due to ineptitude. The story of what happened with Toyota’s “unintended acceleration” lawsuit should really be more widely known and feared.

At the end of it all, after noting that Toyota’s engineers didn’t even have access to a bug-tracking system, that they did not have access to static analysis tools, and that out of some 30+ coding practices either ‘suggested’ or ‘recommended’ by the automotive industry only 4 were actually followed in Toyota’s code… the judge threw up his hands and said that because no standards exist, Toyota could not be said to have violated any, and thus its executives could not be found guilty of criminal negligence. The judge found Toyota liable for the wrongful deaths on the civil matter, but before he could award damages Toyota settled out of court with the families. So no punitive damages were levied, allowing Toyota to simply factor in the need to settle with a few people here and there without needing to do anything expensive like give their engineers access to better tooling, or, god forbid, let the engineers make the call on when the product is ready to ship instead of the MBAs.

When the first fully autonomous car ships, it will be from the company that rushed it the most.

2 Likes

I can hardly believe only half a dozen people came to comment. This was a truly wonderful - and powerfully scary - thing.
The author’s application of the technology issues to autonomous vehicles and a traffic accident is powerful but incidental; what is truly terrifying is the cumulative effect of AI and its data access on all manner of human activities and experiences, and the potential implications of that.

2 Likes

This topic was automatically closed after 5 days. New replies are no longer allowed.