Driverless car in San Francisco almost runs over moms and kids (dashcam video)

Originally published at: Driverless car in San Francisco almost runs over moms and kids (dashcam video) | Boing Boing

6 Likes

If your vehicle turns into a deadly killbot every time it encounters cellphone reception issues then it doesn’t belong on public roads.

44 Likes

They rely on cellphone networks? Yikes. I hope they geofenced off the Broadway tunnel.

15 Likes

As always, the answer to the question “who could have imagined that driverless cars run by taxi companies would turn our streets and cities into a no-go zone for pedestrians, cyclists, mobility-aid users, public transport users, and eventually private car drivers too?” is: absolutely fucking everyone. Especially the people running and financing those companies. They have no other plan; it’s the only way to justify their valuation.

24 Likes

“…a real-life trolley-car problem” I don’t think that means what you think it means.
Glitchy Cruise. Scientology programming does that.

3 Likes

There should be an enhanced self defense allowance for protecting yourself or others from a buggy AI.

2 Likes

Perhaps blind pedestrians will be permitted to carry hand grenades.

(As in “X Marks the Pedwalk,” Fritz Leiber, 1963)

6 Likes

These cars are not ready to be out there, and humans are not ready for these cars. I am not anti-innovation, but driverless vehicles are a BAD IDEA.

9 Likes

Who gets the traffic citation?

4 Likes

I don’t think they’re inherently a bad idea. Driverless cars are potentially far safer than cars driven by humans (who do stupid things like driving while tired or drunk). Whether they’re at that point yet, I don’t know; I haven’t seen enough data on their safety record. I wouldn’t conclude they’re unsafe from this one incident alone, though the reports of mass glitching are concerning.

Traffic citations are the wrong mechanism here. I think what we need is something akin to what the NTSB does for aviation safety incidents: when something doesn’t work correctly, you analyze why it failed and you fix it. Then the lessons from that investigation go into standards that everyone has to follow.

7 Likes

The real question is why these giant metal boxes with no one driving them are allowed on our streets before their safety is proven. Is it mostly because fuck pedestrians, wheelchair users, and cyclists?

Thanks for this. Have they done it? Did they get removed from our streets, tested, fixed, tested and re-tested and only then put back out there? Because it sure seems like that should have happened. Do you think someone should be held responsible for these things damn near running over women and kids, causing traffic jams, blocking other vehicles, etc ad infinitum?

Maybe their research should just be allowed to continue on our streets; they can do it all as they see fit, rather than taking everyone else’s safety into consideration.

14 Likes

Until self-driving cars can do EVERYTHING that a safe driver is legally required to do, they are a bad idea.

That includes, but is not limited to, things like

  • Adapting to unexpected road conditions
  • Following instructions from humans directing traffic
  • Driving safely with or without cellular/data/GPS reception
  • Responding appropriately to emergency vehicles
  • Pulling over out of the traffic lane to let passengers embark and disembark safely

…and other tasks these machines apparently haven’t mastered yet.

15 Likes

https://www.gutenberg.org/files/52776/52776-h/52776-h.htm

… oh jeebus the cars are dependent on phone networks to make safety decisions?

Regulators should not be OK with that :face_with_symbols_over_mouth:
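
To be fair, nothing public tells us exactly how Cruise’s fallback logic works, but the fail-safe pattern regulators should be demanding is the opposite of “freeze wherever you happen to be”: if the link drops, an on-board watchdog triggers a minimal-risk maneuver (signal, pull out of the travel lane, stop) that needs no network at all. Here’s a rough sketch of that pattern in Python; every class and method name is hypothetical, not any vendor’s actual API:

```python
import time


class DemoVehicle:
    """Stand-in for a real vehicle interface; both methods are hypothetical."""

    def execute_minimal_risk_maneuver(self) -> None:
        print("link down: signal, pull out of the travel lane, stop with hazards on")

    def follow_planned_route(self) -> None:
        print("link up: continuing planned route")


class ConnectivityWatchdog:
    """Tracks the last heartbeat received over the cellular/remote-assist link."""

    def __init__(self, timeout_s: float = 2.0):
        self.timeout_s = timeout_s
        self.last_heartbeat = time.monotonic()

    def heartbeat(self) -> None:
        # Call this whenever the link actually delivers data.
        self.last_heartbeat = time.monotonic()

    def link_is_down(self) -> bool:
        return time.monotonic() - self.last_heartbeat > self.timeout_s


def control_step(vehicle: DemoVehicle, watchdog: ConnectivityWatchdog) -> None:
    """One planning tick. Key property: losing the network never leaves the car
    stopped in a crosswalk or travel lane, because the fallback behavior runs
    entirely on-board and needs no connectivity."""
    if watchdog.link_is_down():
        vehicle.execute_minimal_risk_maneuver()
    else:
        vehicle.follow_planned_route()


car, dog = DemoVehicle(), ConnectivityWatchdog(timeout_s=2.0)
control_step(car, dog)   # heartbeat is fresh -> follows route
time.sleep(2.1)
control_step(car, dog)   # no heartbeat for over 2 s -> minimal-risk maneuver
```

The point isn’t the code, it’s the invariant: whatever the car falls back to when the network dies has to be something that requires zero connectivity.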

9 Likes

I would say that somewhere near the front of that process you’d need to insert: “the entire network halts operations until the problem is identified and corrected.”
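
Concretely, that could be as blunt as an aviation-style grounding flag the operator (or a regulator) sets fleet-wide: no new trips get dispatched, and vehicles already out finish at the next safe stop until the investigation closes. A toy sketch with made-up names, not anyone’s real dispatch system:

```python
from dataclasses import dataclass, field


@dataclass
class FleetController:
    """Toy model of an aviation-style 'ground the fleet' switch."""
    grounded: bool = False
    reason: str = ""
    active_trips: list = field(default_factory=list)  # trip IDs currently on the road

    def ground_fleet(self, reason: str) -> None:
        # Set by the operator or a regulator after a safety incident.
        self.grounded = True
        self.reason = reason
        # In-progress trips end at the next safe stop; nothing new goes out.
        for trip in self.active_trips:
            print(f"{trip}: end at next safe stop ({reason})")

    def may_dispatch_new_trip(self) -> bool:
        # No new passengers until the root cause is identified and corrected.
        return not self.grounded


fleet = FleetController(active_trips=["trip-1138", "trip-2187"])
fleet.ground_fleet("vehicles stalled in intersections during network outage")
assert not fleet.may_dispatch_new_trip()
```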

11 Likes

this-is-so-correct

4 Likes

Some humans seem all too ready…

3 Likes

No disagreement there, though with the caveat that they need to be able to do those things while maintaining a safety record at least as good as the average human driver’s. They shouldn’t be on the road if they’re more dangerous than human drivers, but we also shouldn’t hold them to an impossible standard.

In at least some cases that’s undoubtedly true, just as the NTSB will sometimes ground a particular type of aircraft.

1 Like

That is only going to work if there is a clear line of similar accountability. In other words, if a robotaxi kills someone through negligence, there needs to be exactly the same accountability for the crime as if a human driver had been behind the wheel. Otherwise it’s just another techbro way to shift accountability from individuals onto an ultimately unaccountable corporate entity.

If the robotaxi in this case had killed a kid, the repercussions for the CEO should be the same as if they had been behind the wheel. Otherwise, we the public should not consent to this.

18 Likes
