Tesla faces lawsuit over another fatal 'autopilot' crash

Originally published at: https://boingboing.net/2020/04/30/tesla-faces-lawsuit-over-anoth.html


@xeni Comments link on the article is broken.

“Autopilot” is designed to automatically process millions of inputs through a complicated version of “The Trolley Problem”. In this case, it detected that there were a large number of stocks in the other lane.


A fatal [Tesla] Model X crash involving a “dozing driver” is blamed on the car’s autopilot feature

The lead sentence may contain some pretty meaningful information as to where the bulk of the fault lay here.


True - it does say Tesla.


Indeed, this is just lazy clickbait journalism. In an article like this I’d also like to see it all put into perspective: how many miles did Teslas drive on Autopilot last year, and how does the number of accidents compare to that of human drivers covering the same distance?

I’m pretty sure we heard about every single death where a Tesla on Autopilot was involved, while ignoring the rest of the ~35,000 annual motor-vehicle deaths in the US.


I’d like to see the Tesla deaths compared to deaths related to other faulty equipment.


Not really. One of the key flaws in the Tesla “autopilot” system is the lack of a good driver-attentiveness check that other systems have. Also, Tesla’s system is notorious for failing to detect, and stop for, objects that are static, such as an emergency vehicle parked on the shoulder, or that move horizontally across a road, such as a tractor-trailer rig crossing it.


Or even just to other faulty cars.

Can you provide some actual data on how the safety of Tesla’s Autopilot compares to other manufacturers’ systems? I’m really curious how it does compared to Uber, Google, Audi/VW, Daimler, BMW, etc., but I didn’t find anything conclusive the last time I looked.

And don’t they all say you can’t take your hands off the wheel? So, at least from that perspective, the driver is definitely at fault?


The National Transportation Safety Board said in its report that Tesla’s design allows drivers to disengage too easily from the driving task…

the cited causes of the crash demonstrate the need for automakers to improve how well the systems monitor driver attention.

Systems such as GM’s Super Cruise use cameras to make sure the driver is looking at the road. If the driver isn’t engaged, the car delivers an escalating series of warnings before the car pulls itself over. Super Cruise was the highest-rated of four advanced driver-assist systems CR tested last year. In contrast, Autopilot’s hand-on-the-wheel sensors don’t detect whether a driver is actually looking at the road.

“The main flaw in Tesla’s system is that checking to see if a driver’s hand is on the wheel isn’t sufficient,” Fisher said. “It’s about doing enough to make sure the driver is engaged.”


Great, Tesla has joined Uber in the race to see how many pedestrians its autonomous driving software can fail to classify before plowing into them at high speed.


Shocker: yet ANOTHER negative Tesla story on Boing Boing without relevant information. Weird!
(Look back at Boing Boing’s “TESLA” history to see the hilariously obvious bias, without corrections later.)

(Tesla makes the top 3 safest cars in the world)

Welcome back to BoingBoing, habitual only-complains-about-Tesla-coverage member!


Not to be a complete killjoy, but if autopilot cars are not safe unless the driver’s attention is 100% focused on the road, then what is the point of them? Just adding a zero to the price of the car?


How much does it pay to be a reactionary fanboy, and where can I sign up?


Xeni, is that you? Sorry to burst your ongoing crusade against TESLA, but here are some real statistics and facts. (You know Tesla makes the top 3 safest cars by far and is one of the ONLY production cars that detects and stops for pedestrians, right? vs. ZERO capability from other brands. Are you a short seller, or is this just a personal vendetta?)

“The demonstration of Tesla’s emergency braking system in real-life videos comes as a stark reminder of the true purpose of the system - to avoid human errors and enhance safety on the roads.”

“Interestingly, Karpathy mentions that the cars seen in the video were not necessarily in AutoPilot mode when the incidents occurred. As per him, the software continuously monitors the environment around the car and such safety features just swing into action when they see an anomaly.”


Oh, you mean like the 9:1 ratio of negative Tesla stories in the last month here on Boing Boing? Not sure, let’s ask Xeni. (The one “positive” story was a YouTuber remaking the Back to the Future video with a Cybertruck, not anything about safety or positive environmental effects.)

Are you… disappointed?


not surprised.

Tesla has yet to file its response to the lawsuit, for which it received a summons just yesterday. The complaint can be read in full as a 45-page PDF.


Yep, read that in the original post. Did we get a correction when Boing Boing posted the “unintended acceleration” hoax article that was debunked and started by ONE short seller? Nope.

From the first sentence, emphasis mine:
“Tesla is being sued by the widow and daughter of a man killed when an allegedly dozing driver let his X’s Autopilot feature steer it into a group of people.”