Surveillance video of a self-driving Tesla causing an 8-car pile-up on the SF Bay Bridge

Originally published at: Surveillance video of a self-driving Tesla causing an 8-car pile-up on the SF Bay Bridge | Boing Boing


Teslas record practically everything that they do, so the company will have the definitive information about the state and actions of the car. Whether or not they release that data, well…


Won’t the legal/regulatory investigation process have some say on that?


Yes, you are right! Looks like the NHTSA is on it.


First- fuck self-driving Teslas. As an AI engineer and researcher, I feel comfortable saying they are not safe and those algorithms are not ready for public use.

That said, that video is not very incriminating of the car at all, honestly. Maybe it kinda cut that first guy off, but there was no excuse for everyone else to pile into the back like that. They were not paying attention and following too closely.

There’s a reason any rear-ending is the rear-driver’s fault in insurance claims. If you aren’t following too closely, you will never plow into someone no matter how suddenly or unexpectedly they stop.


Tesla’s CEO Elon Musk announced the feature was now available to all drivers just hours before the crash

This may not be what the fanbois will doubtless explain away as Musk’s 5D chess, but it definitely resembles 3D Jenga.


Don’t worry, it wasn’t using Full Self-Driving, Tesla knows that definitively.


My main reaction to this is that it should be illegal for Tesla or any other car company to decide on its own whether its cars have autopilot software activated. Any driver-assist technology should require very explicit government review and approval before a company can activate it.


And they should be regulated so that they cannot call it ‘self-driving’ - it is all just highly sophisticated driver assistance. Still needs a driver.


I called Elon a moron and told him to STFU on Twitter and now I’m banned or suspended or something - locked out of my account


I suspect that many of the “self-driving” accidents we are doubtless going to see are going to be like this one: a pack of non-self-driving cars rear-ending each other or the self-driving vehicle as it panics and stops in travel lanes. People are really, really badly equipped to deal with sudden changes in speed on the road, and vastly underestimate how quickly they need to slow down in an emergency. Watch some accident dash cam videos and it’s shocking how many could have been prevented with strong braking as soon as an emergency situation unfolds. But people keep speeding toward an object in their path, and often continue to do so even after they’ve hit it!



Self-driving cars do not exist yet, and it looks like they probably won’t for a very long time yet. Anything that claims otherwise is dangerous false advertising. Teslas are self-driving in the same way that those two-wheeled skateboard things are “hoverboards” and the algorithms that flag stuff on YouTube are sentient self-aware Artificial Intelligence. We can pretend we’re living in the future that’s taking too long to arrive and decide to lower our standards so words don’t mean things anymore, but that doesn’t make it so.


It would be interesting to poll drivers and see how many of them would be willing to board a “self-flying aircraft” made by Tesla.

Then ask them why they’re apparently comfortable turning on “Full Self-Driving” mode and letting the car do its thing.


I think that the interesting thing about this is how everybody is casting this as a Tesla self-driving caused accident.

Consider: If a manually driven Ford had come to a stop, for whatever reason, the narrative would be “pile up because drivers followed too close and too fast”. Nobody would be painting this as a “Ford crash”.

Fact: if all the following drivers had been maintaining an appropriate following distance for their speed and the conditions, this would have been a non-event. The headline wouldn’t exist, and if it did it would be “Car comes to a safe stop on the Freeway”.

Much though I personally would like to eviscerate Elon Musk with a blunt spoon I can find no legitimate reason in this crash to give me an excuse to do it to him or anyone else at Tesla.


Fact: People follow too close, and there’s not anything that’s going to stop that.

If self-driving doesn’t handle that, it’s not fit for purpose.


I disagree. I think this particular instance would be framed something like, “erratic driving leads to 8 car pile up.”

Whatever anyone wants to say about safe following distance, and I’m a huge fan, no human driver in their right mind would put on their left blinker on an enclosed high-speed roadway, brake abruptly in the fast lane, and not be called out as being to blame for the accident, unless a mechanical failure was identified.
IMO it’s totally fair to call out whoever or whatever was controlling the auto, in this case, the much-vaunted “self-driving” mode.


The crash happened just after the Tesla changed lanes and stopped with no traffic in front of it. The first car behind the Tesla had had a clear road ahead before the Tesla stopped right in front of it.


That isn’t universal. Context matters, and it should. The Tesla “sorta cutting someone off” takes following distance out of the hands of the following driver, and that echoes backwards to the other following drivers behind them. Slamming on the brakes after the following distance of multiple cars has already been compromised makes this collision 100% the fault of the Tesla.


What surprises me is how little insurers seem to care so far. If they ever decide that Teslas aren’t to be trusted, then that could greatly inconvenience Tesla.