Watch: A driver trusted his Tesla autopilot — a little too much

At least no-one’s Bambi-Mother died.

1 Like

listen i wasn’t even there when that happened.

1 Like

A certain Mr. T. Humper has ratted you out, ya dirty rat…

1 Like

Hopefully, assuming it wasn’t a stunt, this driver will be more careful in the future. As these things go, this was a pretty mild lesson (as opposed to running down a ped or crashing into a police car)!

1 Like

The extent to which people will trust a Level 2 system continues to alarm me. Yeah, Tesla misleadingly calls it “Autopilot,” and it will make turns/accelerate/brake. But you have to be ready to take over at a moment’s notice, because it sometimes gets it very, VERY wrong.

And that brings to mind another point: I don’t recall signing up to be part of the public beta for this system. I’d like to see more development with a LOT more oversight before the rest of us have to share the road with this tech.

12 Likes

Wow wrong reaction…I’d be like “OMG I just almost died but didn’t, hooray”

1 Like

I cannot understand why “self-driving cars” are allowed on the public roadways. I feel less safe every time I see a Tesla.

I’m no luddite. I made a career in computer automation and CNC controllers and such. I like this computer stuff. A lot.

Sadly, @Vernon_Zimmer is probably correct. Doing it for the clicks and likes. At least keep it safe for everyone else, “influencers”.

4 Likes

I would be very skittish about any traffic condition that was “out of the ordinary”: how do you know the Tesla programmers have accounted for that condition? Those orange cubes, irregularly placed… looked problematic.

And, has Tesla even solved the Trolley Problem yet?

2 Likes

They will if that shuts the guy up and he takes down his content.

1 Like

Even in the late 23rd century, Sulu never left the helm.

8 Likes

Yeah, let’s be clear on what an “autopilot” is. In a plane, it helps maintain level flight in more-or-less ideal conditions. But if – for example – the plane senses that flight parameters are less than ideal, it’s going to tell the pilot, “I don’t know what to do, you take over.”

In an aircraft, with a few seconds’ worth of warning, this can be fine. But aircraft disasters occur when the pilots are distracted and don’t notice that the autopilot is disengaging, or don’t know what to do once it does (Aeroflot Flight 593, Eastern Air Lines Flight 401, and Air France Flight 447 are a few examples).

Driving a car and not paying attention because you have driver assistance features DOES NOT WORK – because when the “autopilot” doesn’t know what to do, the car’s meaty human filling has mere seconds to get back into a headspace where they’re paying attention, taking the wheel, and executing split-second decisions to ensure their own safety and that of everyone around them.

Until AI drivers are PERFECT, this is NOT going to work.

6 Likes

I agree with all that except for one thing: the situation you’re describing is the “ideal” case for Level 2 systems. In reality, the driver often doesn’t have even multiple seconds.

5 Likes

Totally agree with your revision – meant to say “fractions of”. There simply isn’t time for the human to become aware and assess the situation before action is already needed.

4 Likes

Tesla’s entire system, and its whole mindset, is designed to protect the integrity of their autopilot. In all situations it must be human error that caused the accident. In this case, it’s because the human driver wasn’t paying attention. If the car were to hit the pedestrian, it would be the pedestrian’s wild actions. (This has happened with a self-driving car, and everyone went out of their fucking minds trying to explain why the person deserved to be hit. And they weren’t even on a bicycle!) While self-driving has the potential to make driving safer in the future, the need to defend and protect the idea of a safe autopilot over the health of driver, pedestrian, pet, etc., creates entirely new and terrifying issues.

4 Likes
Also, it’s really only going to work when all cars are hooked up, allowing them to intercommunicate and negotiate speed vectors to maintain safe distances.

But that scenario (all cars controlled by higher AI heuristic) is such a scary idea to the Free-Dumb coalition that it might not be a marketable concept.

I mean, right now, we theoretically have a higher AI heuristic that listens to all our phone calls. It’s called a cell phone tower. But I think the analogy for cars would scare away the masses.

1 Like

I might get banned for voicing the wrong opinion, but I just love my Tesla autopilot. Yes, I still have to pay attention, but driving long distance in the dark is just so much easier. I don’t get tired, I can concentrate better, and I feel safer.

Yes, there are doofuses (doofi?) that trust it too much. But then there are BMW drivers too…

Ummmm, no. That’s not what gets people banned around here.
Consider giving the Community Guidelines a read.

9 Likes

Yeah, they are replacing trolleys with HYPER ZOOM TUBES, which contain no switches. Problem solved.

3 Likes

And yet, daylight, clear weather, high-contrast, high visibility traffic control devices…this was almost as easy as it gets.

6 Likes