Tesla eerily silent as federal investigators blame “FSD” in deadly firetruck crash

Originally published at: Tesla eerily silent as federal investigators blame "FSD" in deadly firetruck crash | Boing Boing


There’s probably at least one person on the job but they’d be limited to the word “bunker” whether noun or verb.

Before responding they’d at least have to wait to see if the surviving vehicle occupant can be purchased after the fact.


I’ll just leave this here.


Probably lost in the legalistic weeds, but have the Tesla folks ever officially sold their cars with a full self-driving (‘FSD’) mode, as opposed to hiding behind some notion of partial/assisted self-driving (uh… PASD?) mode? — hat-tip to @jlw for clearing that up in nothing flat ([feckless head-shake at musk-land])

Always amazed that any such officially untested option was tacitly deemed safely legal on public roadways. (“Well, once you allow them to park themselves…”)


So which (probably former or soon-to-be former) employee will Elon say gave him faulty information about Tesla’s self-driving capability? That’s his pattern, right?


Or maybe they’ll offer the surviving occupant a horse. Which is ironic, as some horses have really good autonomous steering.


I wonder how the FDA would respond to a company that marketed “Fully Self Healing” scalpels?


My Mazda has some sort of radar that spooked the shit out of me when it auto-braked one time as I was behind a bus. I don’t know what mode I had engaged, and it hasn’t happened since… guess I could read the vehicle manual. However: Tesla owners are dying, people on the streets are dying, Musk makes fun of the situation, and the Feds say & see & do nothing.

It’s like a Tesla car is a theoretical concept and the Feds don’t have one to test*, and are letting all the bugs get revealed by the public.

* unless the idea of driving a Tesla with all these modes engaged scares the shit out of the testers and they don’t want any part of this.

I drove by that accident while they were still cleaning it up a couple of hours later. Very ugly indeed. It’s going to be a long time before I trust self-driving anything, but the guy had to be paying zero attention at all in this crash – there’s an absolutely unobstructed view of where the fire truck was (just on the far side of the overpass, in the left lane) from at least 4/10ths of a mile away – and the fire truck was already there with its lights on, pre-dawn.


I wonder how much data remains from that accident- and if it was altered.


Fucking Stupid Design?


Full self demolition


I’m pretty sure that as long as there were Tweets describing them as “beta” FSH scalpels, the company would be covered. /s, I think.


Nah, if they were investigational devices, the study would be shut down after a pre-set threshold of major adverse events. Like if the scalpel crashed into a parked fire truck…


About 10 years ago, I was on the side of:

  • In a world without FSD, Americans are dying in or under cars at a rate of about 100 / day, every damn day.
  • FSD doesn’t have to be perfect - it has to be good enough to kill fewer than 100 people per day if we all drove it.

Since then, I’ve had some peeks into the personalities involved, and I wouldn’t trust them to run a lemonade stand without poisoning the whole neighbourhood. And I sure as hell don’t trust them to be transparent about the safety numbers.


What are you talking about? They know that FSD doesn’t kill anyone, it’s near perfect. It’s almost always off when a collision happens.


… after his master class on ADA compliance, maybe his lawyers finally convinced him to STFU


maybe that’s why the driver felt safe to leave it in full self driving. maybe they even believed the hype that it can be safer than driving manually. it’s impossible to know because the driver is dead

what we do know is that tesla’s software has caused multiple collisions with emergency vehicles just like this one. we also know it behaves poorly and seems to have contributed to collisions and deaths in other situations too

the government ( and tesla! ) should eliminate the variable by getting rid of fsd until it can be proved to be safe on the roads ( narrator’s voice: it can’t be proven safe, because it isn’t safe )

it’s entirely reasonable to blame musk, tesla, and their marketing. ditto the government for letting this drag on and causing more needless deaths


Are the collisions with stopped emergency vehicles just collisions with stopped vehicles ‘just outside’ the flow of traffic on shoulders/breakdown lanes, and those happen to be emergency responder vehicles? Are there cases where Teslas are crashing into non-emergency vehicles stopped in breakdown lanes/on shoulders?

I guess I’m curious as to what is really precipitating these crashes. For all I know, the vehicle size/color/flashing lights could dazzle the camera systems (since Tesla only uses cameras, not camera + lidar + radar).

(Okay, the assumption is that FSD is crashing into things it shouldn’t crash into. I just haven’t heard of it crashing into things that weren’t emergency vehicles, or entirely stationary road infrastructure like the one that hit the ramp divider head on at highway speed.)
