Dashcam video of fatal Uber collision released

Yeah, that’s the philosophy that got us Thalidomide being prescribed to pregnant women. Math- and economics-based approaches like the one you’re citing cannot take into account the human lives ruined by bad drugs. For all its flaws, I’ll take the current approval system over the alternative.

6 Likes

This is it. An absolutely epic fail from the AI. If it was working as designed, it needs to go back to closed course development until it is much, much better at this.

I’m really angry. I thought they were far further ahead than what this tragedy would demonstrate. As a cyclist, I’m looking forward to as much assistance as possible from these systems to help stop motorists running us down. Seems I’ll be waiting longer than I thought.

9 Likes

It does, actually. It enumerates them explicitly. It’s your approach that cannot take into account the human lives ruined by lack of access to good drugs. I agree that the lived experience of individuals harmed by bad drugs makes those individuals more aware of what’s causing them harm, and I’ll concede that I don’t think anyone tries to capture that psychological harm. But the number of lives unnecessarily lost is not small.

There are clear cases, right now, where we’ve known for decades that some treatments are harmful, and what would be needed to improve them, but the improved version isn’t FDA approved and no one has enough incentive to get it approved, so people just keep dying for no reason.

Edit: I’m just realizing how off topic I am here. If you’d like to continue this discussion, please move to another thread or PM me.

1 Like

I’ll judge harshly. I think this is clear evidence that they aren’t ready for on-road use. It simply failed what was close to the simplest pedestrian avoidance test one could design. This circumstance should be successfully managed 100% of the time before they’re ready for the road.

7 Likes

I can’t speak for Uber or other companies, nor can I speak for Arizona. I know California has been very strict on self-driving vehicles, but honestly this is something that will always have inherent risk. Just as new medication has to go through clinical trials and can potentially harm people, the same goes for autonomous tech.

That being said, a single death is not acceptable. I do want to see the technology improve, though, so it’s a knife’s edge to walk. How quickly do we want to get there, and how much risk are we willing to accept?

3 Likes

Some possibilities:

  1. It doesn’t matter - if she’d been walking without a bike, she wouldn’t have been any faster, and walking across roads is something people do
  2. Flat tires happen
  3. Broken chains happen
  4. She doesn’t have lights and wishes to avoid the danger and risk of tickets for cycling at night without lights
  5. There’s no curb ramp at the far side of the street and she doesn’t want to dent her wheels trying to jump a square curb

It’s not a marked crosswalk - but it looks to me like there’s a T intersection. Look at the lights to the left. If I’m not mistaken, that intersection makes it an unmarked crosswalk. EDIT: it’s not a crosswalk, as @sp3cialk29 notes, but the infrastructure there is totally insane. There’s a footpath that leads to the road, where there’s a fence in the median to stop people crossing and clearly communicate that they shouldn’t cross. Then a little ways down the road the fence stops, and there’s a brick footpath through the median which clearly communicates “cross here” - despite being no safer a place to cross than where the fence is. It’s nuts.

According to this article, “Uber crash: Video shows moments leading up to fatal collision”:

A large median at the site of the crash has signs warning people not to cross mid-block and to use the crosswalk to the north at Curry Road instead. But the median also has a brick pathway cutting through the desert landscaping that accommodates people who do cross at that site.

So, there are signs (which you can’t read in the dark), and yet the spot is obviously one where people want to cross, given whatever streets and paths lead to it. And, while you can’t read the signs at night, you can read the obvious physical accommodation for crossing there. Sounds like disastrously bad mixed messaging between the built environment and a sign tacked on rather than actually fixing the message the environment conveys.

Also from the article:

Last fall, Uber officials showing off their vehicles on Mill Avenue said their radar and lidar were able to detect objects, including jaywalkers, as far as 100 yards away and avoid collisions.

EDIT: Here’s a photo of the site in daylight, looking off to the left toward where the victim was coming from

I’m speechless at just what stupid city planning that is - there’s a well-built, inviting footpath that is literally impossible to reach without jaywalking (it does not connect to any crosswalk anywhere), with a little sign saying “don’t use me”. The presence of a footpath speaks far louder than any sign possibly could. If the city really didn’t want people walking there, they’d have built a FENCE, not a PATH.

12 Likes

I think the goal was that they would be cheaper.

1 Like

They both did.

1 Like

What’s more important is that with a human driver there’s at least someone who has responsibility and liability for their actions. Who do you blame for an autonomous car’s failure? The development team lead? The individual developer who wrote the specific line of code that failed? The company itself, or its board? See, if no one is held legally responsible for whatever happens with the vehicle, it doesn’t really matter whether it’s automated or manually driven. So long as there’s a legal blind spot, this will always be a problem. And the usual response to saying these companies need to be held liable, at minimum, is that this will impede innovation - as if innovation is more valuable than a human life. This is something that we as a society need to tackle: stop assuming technology is perfect and that liability should never be factored into any venture for fear of “losing innovation.”

2 Likes

There is nowhere I will ever need to get so badly as to accept the risks of getting into a self driving car.

1 Like

Speaking of the 800-lb gorilla in the room.

4 Likes

This is why I think automated cars are the wrong idea for a basic problem. If you’ve got a long commute, why not support mass transit? Whether it’s bus or train or both, there’s less risk to everyone when mass transit is used. Plus it means you don’t have to stress yourself out on a crappy commute (God, I do this every day between Minneapolis and Roseville to my job… FML).

6 Likes

I mean, you’ve got SV folks who want to live forever getting juiced up on teenagers’ blood. We’ve already got vampires/ghouls in real life. So I guess the disdain for human life is baked in now.

4 Likes

If I wanted to short an investment over the last 50 years, it would be the value of human life to the American people. We’re not worth much to us.

4 Likes

The only thing I see a normal driver doing differently in this lighting condition is using their high beams. With regular headlights, a person wearing a black top and darkish jeans at night is pretty dang hard to see.

Still haven’t seen if a toxicology report has been done on Ms. Herzberg. She doesn’t even look up once that I see in the video. The more I read her story, the more tragic it gets.

Blame can be useful, but ultimately I care about safety, not blame. In practice, I think the likely answer is the same as for any other driver - whoever agrees to insure the vehicle has to pay. Let the actuaries work out the relative risk and be held accountable. Mandatory insurance laws already insulate drivers from their own bad behavior. It’s the insurers that pay, then charge the bad drivers more until they either get better, stop driving, or get their licenses taken away for any number of reasons.

I’d like to see a real, rigorous process that needs to be followed, legally, before letting non-human drivers on the road. After that, I’d say that if an autonomous car is not affordably insurable, no one will buy it, and that’s plenty of evidence that it is unsafe.

2 Likes

Playing devil’s advocate here, FYI: the economic incentive to get fully autonomous vehicles is huge, and it goes beyond public transportation. The commercial sector is ripe for the taking, but making the numbers work out for public transportation is definitely also attractive to Uber and other companies.

1 Like

If I have to keep monitoring the road at all times while “driving” an autonomous car, I can think of no reason to use one. Especially if it slows and stops (at the behest of the liability attorneys) for all sorts of things that I can easily identify as non-hazards.

5 Likes

Oh, no doubt half of our problems with the US driving situation come from the fact that we’re far too lenient with drivers. The fact that you rarely see even repeat DUI offenders lose their licenses permanently (even in some cases where someone dies) is in itself a huge problem. So I’d love to see a toughening up of all penalties when it comes to traffic laws. It could just be part of the whole automated-car process as well.

3 Likes

Especially since the human driver would presumably need a moment to realize that the autonomous system wasn’t reacting the way it should. Even with an alert copilot, you end up with a situation where the reaction time is slower than either a machine or a human alone.

4 Likes