Tesla on autopilot almost crashes into passing train

They still had a brush with death, so I’m guessing they care more about that.

5 Likes

The more progress we make in AI the more we discover that the problem is FAR more difficult than we had anticipated. We still don’t know all the things that we don’t know how to do. Which indicates that we are nowhere close to science fiction style AI.

The public that matters here is going to be the juries in the wrongful death cases. And I suspect that they will be pretty skeptical of the fine print behind “full self driving.”

9 Likes

Given Elno’s famous hatred of public transport (and trains in particular), I can only assume this is deliberate: Elno is using Tesla drivers as unwitting kamikaze attackers on rail. Sure, he loses a customer, but at the very least a train gets seriously delayed in return.

It drove me (no pun intended) crazy that they set up “self-driving” car tests on public roads this way, and that they blamed the inevitable fatal crashes on the “driver.” It was unconscionable that this was going on even in a limited scope with trained drivers - because the makers of the tests should have known they had created the perfect recipe for a dangerously inattentive driver. That this problem is now being enacted on such a large scale with a consumer product… (Though I suppose it’s just part of Tesla’s whole philosophy of ignoring established research on safety in favor of something Elno thinks is cool, which we also see with the dashboard controls being replaced by a touch screen you have to look at and fiddle with to do anything.)

15 Likes

was the “progress” over the last decade anything more than resolution and scale? the basic principle of current “ai” seems to me very much like a dead end for “real” ai, and all that talk from the industry about generalized “ai” is either pr-speak or magical thinking. or both.

5 Likes

Long ago.
But they don’t tell you what it’s full of.

5 Likes

In April, Elno pushed out the FSD package to all Teslas for a month. We aren’t sure if it was to try to convince Tesla owners to pony up several thousand to add it to their vehicles, or to gather a shit-ton of data to train their machine intelligence on. We were very relieved when it got turned off May 1, as even in temporary trial mode there were a few 'scuze me? Aaaahhhh!! moments.

9 Likes

Someone I had thought made reasonable TikTok videos described it as being like driving with a kid who’s just learning to drive. Then, after some use, they described it as more like an inexperienced driver rather than a better one, as they began to trust the system more and more. I think they’re just suffering from confirmation bias. The system didn’t keep getting better; it’s still the same as driving with a fresh learner. They just developed familiarity with it: nothing bad had happened yet, so clearly it wouldn’t.

This sounds like a recipe for disaster. Having driven with a kid just learning to drive, I do not need that much stress every time I need to drive somewhere.

Watching the videos posted, the driver doesn’t appear to hit the brakes at all. They just assumed the car would stop. Then assumed some more as it got closer. Until, at the last possible moment, the driver, not the car on its own, swerved - right through the train signal gate and into a ditch.

How much of a delay are we thinking here? In a car vs. train encounter, I’ve never heard of the train losing.

8 Likes

The Tesla was just trying to solve the trolley problem the only way it could.

7 Likes

That’s much like saying “Our Ice Cream is different from our other feature, ‘Gelato,’ in that Gelato, while actually made from steamed broccoli, is edible. Our Ice Cream feature is actually a steaming pile of dogshit.”

6 Likes

You missed:

FSD Fucking Stupid Driver
FSD Fleeced Sucker Dupe

etc.
:slightly_smiling_face:

8 Likes

I am not anywhere near in Tesla’s corner, but part of me feels that this is really partly on the driver in the video shown. Visibility is clearly very poor at the time of the video. The front dash-cam shows that it is extremely foggy, as even the train crossing lights are difficult to see. This seems like reckless weather to be running your car in FSD mode. FSD is clearly beta software and not safe, but this didn’t look like a good time to be using it in public, low traffic road or not.

1 Like

I’d say rather it’s reckless weather for FSD to be operating in, or at least at that speed. If the driver is supposed to be able to cover for situations that the system can’t anticipate, then it needs to operate in ways that allow for the limitations of the human element. Either FSD shouldn’t operate at all in those conditions, or it should operate within the limits of the driver’s sensory perception, not the car’s sensors.

8 Likes

When a train kills somebody there is a bunch of paperwork that needs to be completed before the engineer can finish his journey.

6 Likes

That’s helpful clarification. The headline gets it wrong and, for anyone who already knows the difference between those terms, would be really confusing.

5 Likes

When I used to regularly take commuter rail, there were frequent delays due to collisions with motor vehicles, as reports had to be made and everything had to be checked out, even though it didn’t do much to the trains, mostly. (Every so often a train would hit something big, like a truck, and partially come off the rails, causing even more delays.) Even when the train wins, the passengers lose. If you want to have people turn away from public transport, you just need to make it less convenient - and if you need to sacrifice a few Tesla owners to convince a trainload of people to drive a car, isn’t it worth it?

(I’m almost half believing this conspiracy theory now, given how morally bankrupt Elno is… only his technical incompetence definitely prevents him from doing it.)

8 Likes

If you click into the post asking for videos, the replies seem to think that everything is the (unengaged) driver’s fault. “You reacted too slowly to save yourself from FSD” and “You are still responsible for the car in FSD” are the most common. Muskrat brigadiers are so dumb. And deadly!

5 Likes

From my interest in YouTube videos of crazy car crashes, the USA doesn’t need FSD because they already have that capability.

1 Like

THIS!
Reacting to a sudden situation when in control takes far less time than doing so when not in control but merely monitoring, no matter how closely. And the monitoring requires, bizarrely, greater concentration than the muscle memory of being in control.

ETA @wazroth explained it better and in more detail.

7 Likes

Darwin approves of Tesla autopilot.

More importantly, the train driver might be completely traumatised by the accident and may never be able to return to their job.

6 Likes