Dashcam video of fatal Uber collision released

I could be driving, you don’t kn-

3 Likes

I swear I heard this story on 419 Eater.

It may also mean more people die - who knows. However many people die, they shouldn’t have their rights taken away to comfort a very large corporation.

1 Like

This is why self-driving test cars are heavily regulated in California. Uber notably moved their testing to Arizona because the state promised not to be such sticklers about safety. This was after Uber got into a fight with the California DMV because they hadn’t registered any of their self-driving vehicles with the department, claiming that the safety driver’s presence meant that the cars weren’t fully autonomous. The California DMV wasn’t having any of that shit. Rather than admit defeat and pay the $150 per car to register the vehicles with the state and comply with mandatory reporting of every traffic incident, Uber just relocated to a more business-friendly environment.

Also, I’m completely unsurprised that a self-driving vehicle platform with a history of blatantly running red lights (which is what brought Uber’s self-driving cars to the Cali DMV’s attention in the first place) was speeding in a 35 MPH zone and failed to do anything for several seconds to avoid killing a pedestrian.

(Finally, with regard to walking a bike across a street, leaving aside legality it’s often easier to get into/across the street at speed by walking it, versus trying to pedal from a standing start.)

3 Likes

Actually, I’m not at all sure she’d lose her license, especially if she had that dashcam video. Surprisingly few “accident” drivers suffer any legal repercussions. There was a big stink in NYC recently when they tried to make tickets mandatory for city bus drivers who, while making right turns, ran over people who were legally in crosswalks.

2 Likes

I do say as much in a subsequent post - California has a good grip on driverless testing. Can’t speak for Arizona, but considering what happened I’d definitely split the blame between the state and Uber. Someone shouldn’t have to die before meaningful change happens.

Agreed, but that’s irrelevant because it’s in the past. The safety record now includes one egregious fatality.

I fervently hope that it’s just a one-in-a-million bug in the software (and/or hardware). My fear is that it’s not that at all, but an obvious failure that could (should) have been detected in off-road trials – which were skipped entirely, or just rushed, because OMG-we-need-this-project-done-now-or-we’ll-lose-to-Waymo-Toyota-Ford-everyone.

The video shows a situation that should have been in the butter zone of detectability. The fact that the vehicle drove right into the pedestrian with no detectable braking or steering input is just plain wrong and terribly worrisome. These systems should react on millisecond timescales, and there should be overlapping layers of self-checks. If some subsystem was briefly offline, the vehicle should – at the very least – start decelerating and signaling a system failure to the human operator.
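To make the failsafe idea concrete, here’s a minimal sketch in Python of the watchdog pattern I have in mind. The subsystem names, the 100 ms staleness threshold, and the fallback behavior are all my own illustrative assumptions - nobody outside Uber knows how their stack is actually structured:

```python
import time

# Hypothetical watchdog: if any perception subsystem goes stale, fall back
# to braking and alerting the operator instead of coasting on at speed.
# Subsystem names and the 100 ms threshold are illustrative assumptions.

STALE_AFTER_S = 0.1

class PerceptionWatchdog:
    def __init__(self, subsystems):
        self.last_seen = {name: time.monotonic() for name in subsystems}

    def heartbeat(self, name):
        # Called whenever a subsystem delivers a fresh frame.
        self.last_seen[name] = time.monotonic()

    def stale_subsystems(self):
        now = time.monotonic()
        return [n for n, t in self.last_seen.items() if now - t > STALE_AFTER_S]

def control_step(watchdog):
    stale = watchdog.stale_subsystems()
    if stale:
        # Degraded mode: decelerate and hand off; never keep planning blind.
        return f"BRAKE + ALERT (stale: {', '.join(stale)})"
    return "NORMAL PLANNING"

wd = PerceptionWatchdog(["lidar", "radar", "camera"])
time.sleep(0.15)          # simulate radar and camera going silent
wd.heartbeat("lidar")
print(control_step(wd))   # -> BRAKE + ALERT (stale: radar, camera)
```

The point isn’t the specific numbers - it’s that “a sensor went quiet” should degrade into braking by default, not into continuing at 40 MPH.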

2 Likes

Yeah, those tickets can be deadly.

ETA: but a “self driving” vehicle should still have detected an obstacle and avoided hitting it.

How many autonomous vehicles are in operation? How many driving hours have they accumulated? How does the single fatality in that many hours compare with the human record of roughly a hundred fatalities per day in the US and the number of hours that humans drive?

I don’t know the answer, and I suspect that the sample size of the autonomous vehicles is too small for statistical confidence in the result either way. But it would not astonish me to learn that the safety record is still “pretty good” even in light of the recent accident. It also wouldn’t astonish me to learn the opposite - and as I said, the sample size is still likely too small to draw a conclusion.
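For anyone who wants to see just how wide the error bars are, here’s a back-of-envelope in Python. The human figures are round public numbers (~37,000 US road deaths per year over ~3.2 trillion vehicle-miles); the autonomous mileage is a placeholder I made up, since I don’t know the real total:

```python
from scipy import stats

# Round public numbers for the human baseline; the AV mileage below is a
# PLACEHOLDER assumption, not Uber's real figure.
human_deaths_per_year = 37_000
human_miles_per_year  = 3.2e12
human_rate = human_deaths_per_year / human_miles_per_year * 1e8
print(f"human baseline: ~{human_rate:.2f} deaths per 100M miles")

av_miles, av_deaths = 3e6, 1   # assumed fleet mileage; one fatality
av_rate = av_deaths / av_miles * 1e8
print(f"AV point estimate: ~{av_rate:.0f} deaths per 100M miles")

# Exact Poisson 95% interval on a single observed event.
lo = stats.chi2.ppf(0.025, 2 * av_deaths) / 2
hi = stats.chi2.ppf(0.975, 2 * (av_deaths + 1)) / 2
print(f"AV rate 95% CI: {lo / av_miles * 1e8:.2f} "
      f"to {hi / av_miles * 1e8:.0f} per 100M miles")
```

With a single fatality, the exact Poisson interval on the rate spans more than two orders of magnitude and still straddles the human baseline - which is exactly why the sample is too small to conclude anything either way.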

Humans are so egregiously bad at the task that I suspect autonomous vehicles will outperform them sooner rather than later - but the quest for perfection will keep humans in control long after it would be sensible to cede it.

Yep. This seems like a textbook case of something the car not only should have been able to see, but should have seen better than a human would.

5 Likes

It’s extremely clear that two things need to be said:

  1. There will always be “live” testing of new technologies that puts human lives at risk. This is true for literally everything that has ever been engineered throughout all of human history. It’s the ethical responsibility of the people implementing the technology to bear final responsibility for anything that goes wrong in testing. The idea that all of this can be done on a closed track is also completely incorrect, even in the early phases of testing.
  2. It is abundantly clear this was 100% a failing of the AI, and the loss of life is not the responsibility of the safety driver. But knowing the current climate in the country, the driver will be blamed for what is obviously an AI vulnerability that needs to be explored both internally at Uber and externally with a full investigation. Autonomous vehicles should be pulled off the road during the investigation, the scenario should be replicated artificially, and the root cause determined before anyone even thinks about continuing the testing on public roads. That is literally the “standard” to which a situation like this should be held, but we all know Uber will continue to be enabled into mediocrity.
2 Likes

This is a PSA both against distracted driving and for having bike lights, spoke reflectors and reflective clothing. I wonder if the AI would have seen any of those, but a driver with eyes on the road surely would have.

Which points to the total absurdity, in the first place, of high-speed heavy machinery juxtaposed with the soft flesh of the mammals that created it.

1 Like

3, 2, 1… go

P.S.: What happened was definitely Uber’s fault, regardless of the safety driver. As someone else mentioned, Uber chose not to operate in California because the state has more stringent rules for autonomous cars. Now here we are.

5 Likes

Just looking at the released video and noting the number of street lights, this does not look like an actual dark street. Those shadows would’ve had detail on my 2008 car’s backup camera for sure. The released video makes it look much darker than it likely is in real life.

Maybe they’ve chosen cameras that sacrifice light sensitivity for a higher resolution or frame rate? I wonder if the camera video released is a video feed used for navigation or is just a video-review/CYA camera.

1 Like

I presume that’s just the dashcam and its image quality. The car’s sensors would not have been relying on that feed - I thought it was using LIDAR?

If you can’t see a pedestrian who is already in the middle of the lane then you shouldn’t be operating your vehicle in those conditions. That is as true for an autonomous car as it is for a person driving. This isn’t some small child darting out between parked cars. It was a full-size adult crossing an urban street - a 100% foreseeable scenario for the vehicle’s operation. If it is too dark for your vision system, stay off the road. If the physics of your car don’t grant sufficient stopping distance, you are going too fast.
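To put numbers on the stopping-distance point, here’s a minimal sketch in Python using the textbook formula (reaction distance plus v²/2μg). The 1.5 s reaction time and 0.7 friction coefficient are standard assumed values, not measurements from this crash:

```python
MPH_TO_MPS = 0.44704
G = 9.81  # m/s^2

def stopping_distance_m(speed_mph, reaction_s=1.5, mu=0.7):
    """Reaction distance plus braking distance on a flat, dry road.
    reaction_s and mu are assumed textbook values."""
    v = speed_mph * MPH_TO_MPS
    return v * reaction_s + v**2 / (2 * mu * G)

for mph in (35, 40, 45):
    print(f"{mph} MPH: ~{stopping_distance_m(mph):.0f} m to stop")
# 35 MPH: ~41 m, 40 MPH: ~50 m, 45 MPH: ~60 m
```

The interesting part is the split: at 40 MPH a human burns ~27 m just reacting, while the braking itself only takes ~23 m. A machine that reacts in milliseconds should need roughly half the distance a human does - which makes the total failure to brake here even less excusable.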

4 Likes

I bet it’s using LIDAR and several cameras. If that’s the case, the LIDAR detection failed hard: a slowly moving pedestrian went undetected on a flat road with no obstructions.
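Nobody outside Uber knows what their perception stack actually does, but geometrically this is not an exotic problem. Here’s a toy sketch in Python of the core check - LIDAR returns inside the lane corridor ahead should trigger braking. The corridor width, lookahead distance, and point threshold are all numbers I invented for illustration:

```python
import numpy as np

# Toy illustration of why a lone pedestrian should be easy for LIDAR:
# in the vehicle frame, a cluster of returns inside the lane corridor
# ahead should trigger braking. Geometry and thresholds are made up.

LANE_HALF_WIDTH_M = 1.8   # roughly half a lane
LOOKAHEAD_M       = 60.0  # beyond worst-case stopping distance at 40 MPH

def obstacle_in_path(points_xy):
    """points_xy: (N, 2) array of LIDAR returns, x forward, y left."""
    x, y = points_xy[:, 0], points_xy[:, 1]
    in_corridor = (x > 0) & (x < LOOKAHEAD_M) & (np.abs(y) < LANE_HALF_WIDTH_M)
    # Require several returns so one noise point doesn't brake the car.
    return np.count_nonzero(in_corridor) >= 5

# A pedestrian with a bike ~25 m ahead, slightly left of center:
pedestrian = np.random.normal(loc=[25.0, -0.5], scale=0.3, size=(40, 2))
print(obstacle_in_path(pedestrian))  # True: plenty of returns in the corridor
```

Real systems obviously do far more (tracking, classification, trajectory prediction), but a cluster of returns dead ahead on an empty road is the easy case - which is what makes this failure so alarming.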

If there were charges and a trial we’d get to see what imaging/detecting they’re doing, but that’ll never happen.

2 Likes

It’s definitely a giant fail if the LIDAR could not see a lone pedestrian on a non-busy street. The only thing I can think of is that the bike confused the car, but you’d figure it should’ve been able to detect and stop for any obstruction on the road.

1 Like

And if you aren’t paying attention to oncoming traffic, you shouldn’t be in the road. It goes both ways. Uber is a shitty outfit, but even with one of their autonomous systems failing, this collision would have been avoided if the pedestrian had looked up. Just once.

1 Like