Watch: Self-driving Teslas run over child-sized dummies "over and over again," according to safety advocacy ad

Originally published at: Watch: Self-driving Teslas run over child-sized dummies "over and over again," according to safety advocacy ad | Boing Boing

5 Likes

Viable lawsuits that could have been prevented are in Tesla’s future.

28 Likes

I swear I’ve seen this before. Maybe a few months ago there was a version of this?

Also: Daaaamn, they’re going for the throat! Get it!

10 Likes

Maybe the autopilot is just smart enough to tell the difference between a child and a mannequin? /s

12 Likes

Allowing the vehicle to plow headfirst into stationary objects, causing thousands of dollars in damage, is a feature-not-a-bug?

11 Likes

Probably… (note my /s)

6 Likes

I have to admit that even in this weary, image-laden media landscape, I gasped at the Tesla hitting dummies over and over again. Sheesh.

7 Likes

I can’t help but be skeptical here. If the cars couldn’t detect obstructions that size, it’s hard to fathom how they’d function at all. This is just knee-jerk skepticism, but my instincts say something’s off. Also, I was hoping the video was of one Tesla maliciously running over and then backing over the same mannequin repeatedly. Oh well.

5 Likes

They don’t! That’s rather the point. The systems are wildly unfit for purpose. The number of cases of cars with “autopilot” turned on running into stationary objects is quite high…

16 Likes

O’Dowd has drawn accusations that he is little more than a competitor to Tesla because his company bills itself as an expert in making particular software used in automated driving systems. O’Dowd insists his Green Hills software doesn’t compete with Tesla, saying it doesn’t make self-driving cars. But he has acknowledged some car companies use his company’s software for certain components.

Here is a demonstration of very real ongoing safety concerns about Tesla’s system, but the ad just happens to be funded by a guy who stands to make a lot of money if the Tesla system is regulated out of existence. The phrase “He has acknowledged some car companies use his company’s software for certain components” strongly suggests that if O’Dowd’s company didn’t market competing software, Tesla’s autopilot problems wouldn’t have bothered him as much. I’m not saying this is so, cuz I obviously don’t know. What I’m saying is that in today’s free-market paradise even the saving of lives is inseparable from commercial self-interest. There are real groups doing real advocacy concerning real issues. There are also plenty of advocacy groups formed by commercial concerns piggybacking on real issues in order to boost the ol’ bottom line.

Here we have a very serious issue: the deaths of pedestrians and motorists due to imperfect, largely unregulated technology. But Elon deals with the problem by constantly redefining “autonomous” to assure us there is no problem, while O’Dowd says, “The Tesla system will kill you, but if Tesla were running good software (which I happen to sell, just sayin’) this wouldn’t be happening. Just sayin’.”

11 Likes

Level 5 autonomy, suuuuuure…

•Musk promised a cross-country drive by the end of 2017 - it still hasn’t happened

•Musk claimed in April 2019 it would have one million self-driving taxis on the road by the end of 2020 - it still hasn’t happened, and the Los Angeles Times noted that just a few weeks after Musk’s announcement, Tesla sold $3 billion in stock to fix its cash woes

•In July 2020, Musk claimed that “we will have the basic functionality for Level 5 autonomy complete this year” - Tesla vehicles still don’t offer Level 5 autonomy

•In January 2021’s earnings call, Musk said he was “highly confident the car will be able to drive itself with a reliability in excess of humans this year” - that also seems unlikely

11 Likes

Here’s some footage from one doing it today.
Edit: This video actually might be by the Dawn Project too?

5 Likes

Here is a demonstration of very real ongoing safety concerns about Tesla’s system, but the ad just happens to be funded by a guy who stands to make a lot of money if the Tesla system is regulated out of existence. The phrase “He has acknowledged some car companies use his company’s software for certain components” strongly suggests that if O’Dowd’s company didn’t market competing software, Tesla’s autopilot problems wouldn’t have bothered him as much. I’m not saying this is so, cuz I obviously don’t know.

Green Hills Software is a well known vendor of embedded real-time operating systems and associated tools. Their software is used by companies that build automated systems (which is just about every electronic device these days) in many areas, not just automobile companies. It is likely that Tesla itself has licensed their software at some point. The concern here is poor implementation and misuse of such software in automated systems, a problem that is not confined to the automobile industry.

15 Likes

I work in industrial automation. A lot of my career has been spent helping build software that keeps people safe in high-risk situations. Thing is, those high-risk situations are a lot more predictable than an average day in traffic.

People who think this kind of problem can be solved with AI often don’t appreciate that AI is really just statistics: “this usually leads to that.” An unusual this can make the algorithm predict a really weird and unhealthy that. And as a motorcyclist who has literal skin in the traffic game, I can confirm: there are a whole lot of unusual this events in a really average commute.
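The “AI is just statistics” point can be sketched with a toy classifier. This is a deliberately simplified illustration with invented numbers, not how any real perception stack works: a model that has only ever seen familiar cases must still map an unusual input onto one of the labels it knows, with no built-in notion of “I’m unsure.”

```python
# Toy illustration (all data invented): a 1-nearest-neighbour "obstacle
# classifier" trained on two familiar cases, then fed inputs it never saw.

def nearest_label(x, training):
    """Return the label of the training example whose feature is closest to x."""
    return min(training, key=lambda ex: abs(ex[0] - x))[1]

# Feature: apparent obstacle height in metres (made-up values).
training = [
    (0.0, "clear road"),   # nothing ahead
    (1.5, "vehicle"),      # typical car profile
]

# A child-sized obstacle (~1.0 m) sits between the two clusters; the model
# snaps it to the nearer one rather than flagging it as unfamiliar.
print(nearest_label(1.0, training))   # -> "vehicle" (closer to 1.5 than 0.0)
print(nearest_label(0.6, training))   # -> "clear road" (the dangerous miss)
```

The second case is the failure mode being described: an unusual *this* (a small obstacle unlike anything in the training distribution) produces a confidently wrong *that*.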

It might be made to work on dedicated roads that only allow cars with compatible software; no people on bikes or feet. No motor vehicles operated by a human brain. No access to wandering pets or livestock. Road surfaces under strict maintenance. Doesn’t sound cheap, does it? Maybe there are better ways to make roads safer.

tl;dr: actual safety software professionals have never put much faith in self-driving cars.

42 Likes

But the collision detection system seems to work great!

2 Likes

I initially assumed those promises were pure PR, where he knew it wouldn’t happen, and while that seems to be part of it, it also seems like he’s rather ignorant about how well these things actually work. Reading one particular article about Musk’s personal contributions to the autonomous driving systems was a real eye-opener, as one of his inputs was to insist they didn’t need a bunch of sensors beyond the cameras, because human beings drive only using their eyes… It was at that point that I realized Elon Musk is a fucking idiot, and in this context, a dangerous one.

27 Likes

Wow, it’s amazing that they can function exactly as programmed every time.

1 Like

In May of this year I was entering I-95 in MA to go south. As I was turning onto the access road, a Model 3, which I later found out was using FSD (yes, there is a lawsuit), came to a stop. I turned onto the access road while they were stopped. Halfway through the turn, past the apex, the car accelerated hard and hit me right before the C pillar on the driver’s side. The Autopilot software did that, not the driver. She was beside herself. I remember looking over and she was looking down at her phone, pretty common, and I never saw her look up while the car accelerated into me. She was dumbfounded as to why her car did that.

The state police did the investigation, and it went to the DOT and NHTSA. I had a couple of bumps and bruises; that car accelerated pretty hard off the line. I initially thought she had hit the accelerator, but she didn’t. Just my .02, but this sham should be shut down and made to offer refunds to the customers who purchased it and haven’t been able to use it.

The lawsuit is because, although the chassis was bent in the accident, Tesla Insurance will not pay what the car books for and what it will cost me to purchase a new one comparable to what I had. I always thought I would like a Model S when I retire, but I’ve had all I can deal with of Tesla and Elon Musk. I don’t need to enrich his sorry ass.

23 Likes

What number of deaths and accidents do we as a society determine is too many? Or do we just not care because self-driving cars are “cool”?

4 Likes

Well that’s horrifying.

3 Likes