Tesla 'assertive' mode is too aggressive, 53,822 cars recalled

Yes, it’s an update.

1 Like

In the UK, most (non-traffic-light) road junctions have “Give Way” signs, i.e. you slow down, but only have to stop if there’s something coming.
“Stop” junctions are getting pretty rare, and usually they’re used where part of the junction is blind, or dangerous in some way, so you can’t see what’s coming until you’re stopped.

I can imagine that if every junction said “Stop” on it, then people would just start doing rolling stops everywhere, up until they reach a dangerous junction and get hit.

3 Likes

What?

hmm, let me think how to explain this.
In the UK, it’s not illegal to just roll through a ‘Give Way’ junction as long as it’s safe. So, when you come up to a junction with a big red STOP sign, you know it’s there for a reason, so you (probably) come to a complete halt and have a good look for dangers before you pull out.
If every junction in the UK had a STOP sign, people would realise that they didn’t have to completely halt for most junctions, and would start ignoring the rules and just slowing down for STOP signs, rather than coming to a complete halt. Then, when they reach a dangerous junction, they don’t stop, and instead just roll into the road and get crashed into by a vehicle they’d not seen because it was a blind corner or something.
Is that better explained?

2 Likes

The software push usually happens overnight. You turn on the car in the morning and it lets you know there are new updates installed.

5 Likes

If I wanted a rectangular steering wheel then I would drive an old Austin Allegro.

8 Likes

This all gets even more monstrous if you understand how these algorithms work and why there is an “assertive mode” at all.

First: I spent 20 years working in AI, with an emphasis on autonomous navigation systems for complex environments. I know what I’m talking about here.

What you have to understand (and that the general public does not) is that self-driving consists of a couple of dozen of what we call Hard Problems in computer science. These are problems for which the solution can’t really be calculated, only approximated. The whole system is probabilities. It’s how all this “machine learning” stuff you hear about now works. Probabilities. Siri cannot identify the phrase “what time is it”. What it can do is take a long string of noise and map that to a series of likelihoods of that noise being “what time is it” or “play Weezer” or “call mom”. That’s all “machine learning” is. It is so much more primitive than the lay public thinks it is.
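
To make that concrete, here’s a toy sketch in Python (the candidate phrases and scores are invented, and this is nothing like a production speech pipeline): the recognizer never “hears” a phrase, it just scores every candidate and acts on the most likely one.

```python
import math

# Toy illustration only: a recognizer maps one noisy utterance to a
# probability for each candidate phrase, then acts on the best guess.
CANDIDATES = ["what time is it", "play Weezer", "call mom"]

def softmax(scores):
    """Turn raw model scores into a probability distribution."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Pretend these scores came from an acoustic model (made-up numbers).
raw_scores = [2.3, 0.4, 1.1]

probs = softmax(raw_scores)
for phrase, p in zip(CANDIDATES, probs):
    print(f"P({phrase!r}) = {p:.2f}")

# The system acts on the argmax, but it is only ever a best guess.
confidence, best = max(zip(probs, CANDIDATES))
print("acting on:", best, "with confidence", round(confidence, 2))
```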

None of this is ever EVER guaranteed. The car does not know there is a stop sign there. It knows there is an 82% chance that this frame of video contains a stop sign and a 71% chance that it is in our lane and an 89% chance it is close enough that we should consider stopping for it.
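
If you want to see why that stack of maybes never collapses into certainty, here’s a back-of-the-envelope sketch using the numbers above (the independence assumption is mine; real systems can’t even assume that much):

```python
# The car never "knows" there's a stop sign; it has layered estimates.
p_sign_in_frame = 0.82  # chance this video frame contains a stop sign
p_in_our_lane = 0.71    # chance the sign applies to our lane
p_close_enough = 0.89   # chance it's near enough that we should stop

# Naive combination, assuming independence (real systems can't):
p_should_stop = p_sign_in_frame * p_in_our_lane * p_close_enough
print(f"P(should stop) = {p_should_stop:.2f}")  # 0.52 -- a coin flip
```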

This is where “assertive” mode comes in. Because the algorithms are never sure about anything (and to be clear, never will be), you need “fudge factors”. How sure is sure enough that it’s a stop sign and we should stop? If you set it to 100%, the car will run every single stop sign. If you set it to 85%, the car will stop randomly in traffic at red birds and polygonal logos on signs. There’s no answer that works in every situation, so they are beta testing these sets of fudge factors to see which works best. “Assertive” mode is one of those. No fudge factor will always work. It’s about picking where you want your errors to land: do you want the car to fail by occasionally stopping for nothing (which you see the Google and DARPA test vehicles do all the time), or do you want it to run the occasional stop sign? Both are dangerous in different ways. This technology is not safe.
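
Here’s a little sketch of that tradeoff (all the detector confidences are invented): sweep the threshold and watch the errors move between the two failure modes rather than disappear.

```python
# Invented detector confidences: higher means "more stop-sign-like".
real_signs = [0.95, 0.88, 0.83, 0.79, 0.72]    # actual stop signs
false_alarms = [0.86, 0.81, 0.74, 0.60, 0.41]  # red birds, logos, etc.

for threshold in (0.70, 0.85, 1.00):
    runs = sum(1 for c in real_signs if c < threshold)         # signs run
    phantoms = sum(1 for c in false_alarms if c >= threshold)  # stops for nothing
    print(f"threshold={threshold:.2f}: runs {runs} real signs, "
          f"{phantoms} phantom stops")
```

No threshold gets both error counts to zero; you only get to choose which failure you prefer.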

All of this should, of course, terrify you. These cars should not be on the road, and no ethical AI engineer would say otherwise. You know all those times Siri misunderstood you? With a self-driving car, someone just died. They are the same algorithms.

The other thing about self-driving is that it’s a classic 80/20 problem in computer science: 80% of the work is getting the last 20% right. People think that because we’ve seen a few controlled demos, self-driving cars are imminent. They are not. We will not see fully autonomous vehicles on open public roads in our lifetimes. That is way, way further away than people think. They see the 80%. We’re “almost there”. No, the last 20% (the part that keeps the cars from killing people) is still decades of work.

This is a frustrating topic for me because the mainstream press gets every single detail about this technology wrong, and the general public is way too optimistic about this type of technology.

22 Likes

This. I can fault Tesla for a lot of stuff, but what is unsafe about being the only car at a stop sign? I don’t always stop at stop signs. I drive a manual, and stopping is just more wear on an already expensive clutch design. If the Tesla is capable (and I’m not saying it is) of seeing what is going on, then I have little issue with it slowing to 5 MPH and rolling through the stop sign.

In 50 years I expect cars to be traveling at high rates of speed with only inches between them. I also expect windows and windshields to not be the norm. I think a lot of people would be very uncomfortable doing 90 MPH with cars physically close enough to touch on all sides of them.

To be fair to my other comment, I agree with you. I hope at some point we envision driving in a more connected way. If the car knows more concrete specifics about its environment, then these calculations become less assumption and more fact. If the stop sign transmits that it is there and the vehicles are interconnected, then we have a level of reasonable facts that allows a much better flow of traffic. Having each individual vehicle guess what the other driver or AI system is going to do isn’t much better than what we have now. (And, to your point, it would take a much, much longer time to become reliable and safer than a person in all conditions.)
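
As a sketch of what I mean (the message format is entirely hypothetical, not any real V2X standard): once the sign broadcasts itself, whether you must stop becomes a lookup instead of a guess.

```python
from dataclasses import dataclass

# Hypothetical vehicle-to-infrastructure beacon, invented for illustration.
@dataclass
class StopSignBeacon:
    sign_id: str
    lat: float
    lon: float
    controlled_lanes: tuple  # approach lanes that must stop

def must_stop(beacon: StopSignBeacon, our_lane: str) -> bool:
    # No confidence threshold needed: this is a fact, not an estimate.
    return our_lane in beacon.controlled_lanes

beacon = StopSignBeacon("sign-042", 51.5074, -0.1278, ("north-1", "north-2"))
print(must_stop(beacon, "north-1"))  # True
```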

Somebody said that developing self-driving cars always ends up reinventing trains.

7 Likes

ETA: The junction has been changed so it is staggered; it cost £500,000 and the work was completed last week.

https://www.hants.gov.uk/News/25012022Ipleycrossroadsopening

9 Likes

read what @VeronicaConnor wrote just above.

it’s not about the safety of being the only car, it’s about the safety of knowing you’re the only car. not to mention correctly identifying pedestrians who might be about to cross, etc.

the ai and the cameras aren’t going to be able to do that reliably anytime soon

8 Likes

How can you know for sure that you are the only car? That was my point: stop signs in the UK mark junctions where you cannot see what is on the road until you are right at the junction, for example.
In the US you don’t have that extra warning, so I guess you have to be extra careful the whole time?

4 Likes

Who cares how convenient it was for the owners to have the problem fixed? The issue is Tesla programming autopilot to violate the law and drive in an unsafe way that puts even more pedestrians at risk. That is fucked up.

12 Likes

Cars that have such a major technical issue that can get people killed should not be on the road. This is not difficult. On the streets with other drivers, cyclists, and pedestrians is no place for fucking beta testing…

11 Likes

This has been addressed a few times already in the thread.

Welcome to BoingBoing

8 Likes

You clicked, though.

Thanks for playing!!!

11 Likes

Clearly, we’ve all fallen down on our duties to dear leader Musk, and need a reminder to better defer and worship our betters… /s

10 Likes

Oh no! Legacy manufacturers who work for the government!

And don’t update their cars software over the air either!

Try harder next time.

8 Likes

Considering they not only delivered a car missing a brake pad (federally required safety equipment!) but also dragged their feet in getting it replaced?

And that Tesla’s build quality is absolute crap for the amount of money that is spent on the things? And their post-sales service makes fucking cable and telephone companies look like saints?

No tesla for me, thank you.

Or a reproduction of KITT from Knight Rider. (I’ve heard that the steering on the car from that series was bad as well.)

the EITMLI5 version: A recall is the name for the process in which an office of the federal government requires the manufacturer to correct a defect in their product that is unsafe and/or has caused injury or death. For most cars, yes, that requires a visit to the dealership to either replace a component or reprogram the firmware on one of the computers in the car. Since Tesla can push updated firmware to their cars remotely, that’s the process for that company.

4 Likes