Dashcam video of fatal Uber collision released

An alternative reading of the same data: cars killed 5,987 pedestrians in 2016 (0.189 deaths per 100 million miles driven). Uber’s cars have killed pedestrians at roughly 180 times the rate of a typical human-driven car.

ETA: That means that for Uber’s cars to avoid pedestrians as well as human drivers do, they need to drive another 500 million miles without killing anyone.
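For anyone who wants to check the arithmetic, here’s a rough sketch of how those figures hang together. The only number not quoted above is Uber’s autonomous mileage, which I’m assuming to be about 3 million miles because that’s roughly what the 180x figure implies:

```python
# Back-of-envelope check of the rates quoted above.
human_rate = 0.189      # US pedestrian deaths per 100 million vehicle miles (2016)
uber_miles = 3e6        # assumed autonomous miles driven by Uber (back-calculated, not official)

uber_rate = (1 / uber_miles) * 100e6     # 1 death over ~3 million miles ≈ 33 per 100M miles
print(uber_rate / human_rate)            # ≈ 176 -- roughly the "180 times" above

miles_per_death = 100e6 / human_rate     # ≈ 529 million miles per death at the human rate
print(miles_per_death)                   # hence "another ~500 million miles" fatality-free
```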

2 Likes

Billions of dollars of tech valuation hang on the belief that AI can make cars drive themselves.

I’m beginning to think it’s pretty much bullshit.

Excellent analysis of the problem here:

4 Likes

Is that a function of Uber’s software as compared to Google’s (or anyone else’s) or could the backup driver have something to do with it?

http://www.12news.com/article/news/local/valley/convicted-felons-are-allowed-to-work-for-uber-in-arizona/75-530456348

I’m gonna keep repeating this because I think it’s important to understanding how shitty Uber’s driverless cars appear to be*: the vehicle was doing 38 in a 35 MPH zone. Yes, the speed limit used to be 45, but it was recently lowered, and a driverless car must be capable of reading and interpreting speed limit signs to stay within the legal limit, or it’s functionally useless. How did it “know” how fast it was supposed to be going? Did someone program the speed limits throughout Tempe manually? If so, how in god’s name is a driverless car supposed to be remotely responsive to changes in speed limits around the globe, either statutorily or as a result of temporary road work? If not, how does Uber justify the fact that their driverless vehicle was speeding at the moment it struck a pedestrian? “Doesn’t speed” is literally one of the biggest bullet points on the “what makes driverless cars safer and better than humans” list (right below “is infinitely more attentive than humans are capable of being” and “doesn’t rely solely on the visible spectrum to identify obstacles”, which also appear to be problems for Uber).
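To be concrete about what “knowing the speed limit” even entails: nobody outside Uber knows how their stack handles it, but generically there are two sources, a limit stored in the map data and whatever the vision system reads off posted signs, and a defensible policy is to obey whichever is lower. A purely hypothetical sketch (none of these names are Uber’s):

```python
from typing import Optional

def effective_speed_limit(map_limit_mph: Optional[float],
                          sign_limit_mph: Optional[float],
                          default_mph: float = 25.0) -> float:
    """Hypothetical policy: when the mapped limit and a freshly read sign
    disagree, obey the lower one; with neither available, fall back to a
    conservative default instead of guessing high."""
    candidates = [v for v in (map_limit_mph, sign_limit_mph) if v is not None]
    return min(candidates) if candidates else default_mph

# The Tempe scenario as described above: stale map data says 45, posted sign says 35.
print(effective_speed_limit(45.0, 35.0))   # 35.0 -- under this policy the car would not be doing 38
```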

*Additional case in point: back when an Uber car ran a red light in San Francisco, Uber originally blamed the safety driver for doing it while in manual mode. Then, thanks to discovery from the Waymo lawsuit, documents emerged which showed that the car was, in fact, in autonomous mode and it just flat out didn’t recognize that there was a traffic light there (as well as in 5 other locations where the lights weren’t red at the time). Notably, there’s also a pedestrian entering the crosswalk on the right in Uber’s red light run, which the vehicle also fails to slow down for:

6 Likes

I would love for you to explain how you think a backup driver might compel an autonomous vehicle to run over a pedestrian.

Agreed. The only “acceptable” accident would be one where a human driver would not have been able to avoid the collision either, e.g. a person dashing out from between two parked cars.

But this is an uncomplicated case of a jaywalking pedestrian. If you can’t figure that one out, you’ve no business being on the road.

4 Likes

A backup driver who isn’t paying attention can allow an AI vehicle to screw up and kill someone. I thought that would be obvious.

Hell, we’ve had well-paid and well-trained transit and railroad operators kill passengers because they weren’t paying attention to their jobs.

Volvo XC90s are supposed to come with a pedestrian detection system. If it was disabled by Uber, Uber is at fault. If it was active and failed, it may be Volvo’s fault.

Edit: found these conditions on Volvo’s website:

The following conditions apply:

- In order to detect a pedestrian, the system must have a full view of the person’s entire body and the person must be at least 32 in. (80 cm) tall.
- The system cannot detect a pedestrian carrying a large object.
- The camera’s capacity to see a pedestrian at dawn or dusk is limited, much as it is for the human eye.
- The camera’s function is deactivated and will not detect a pedestrian in darkness or in tunnels, even if there is street lighting in the area.

WARNING

Pedestrian and Cyclist Detection with Full Auto Brake is designed to be a supplementary driving aid. It is not, however, intended to replace the driver’s attention and judgement. The driver is always responsible for operating the vehicle in a safe manner.

The system cannot detect all pedestrians in all situations, such as in darkness/at night, and cannot detect partially hidden pedestrians, people who are less than approx. 32 in. (80 cm) tall, or people wearing clothing that obscures the contours of their bodies.
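Read as plain logic, those conditions suggest the stock Volvo system was never going to help that night. Here’s a toy restatement of the quoted constraints as a checklist; this is my paraphrase, obviously not Volvo’s actual code:

```python
def volvo_camera_could_detect(full_body_visible: bool,
                              height_cm: float,
                              carrying_large_object: bool,
                              dark_or_tunnel: bool) -> bool:
    """Toy restatement of the conditions quoted above, not Volvo's real logic."""
    if dark_or_tunnel:               # camera is deactivated in darkness or tunnels
        return False
    if not full_body_visible:        # needs a full view of the person's entire body
        return False
    if height_cm < 80:               # person must be at least 32 in. (80 cm) tall
        return False
    if carrying_large_object:        # cannot detect a pedestrian carrying a large object
        return False
    return True

# The Tempe crash: nighttime, a pedestrian walking a bicycle loaded with bags.
print(volvo_camera_could_detect(full_body_visible=True, height_cm=170,
                                carrying_large_object=True, dark_or_tunnel=True))   # False
```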

1 Like

I’m not sure what someone’s status as a felon has to do with their ability to remain attentive, though. It’s not like they cut that part of your brain out when they convict you. Yes, it was stupid of her not to be paying attention, since that was literally her job. No, I don’t think that has anything to do with her being a felon, nor do I think a non-felon would be more likely to remain perfectly attentive. Uber apparently used to require two people in the car together: one to monitor The Algorithm from the passenger seat and the other to watch the road. They dropped that requirement, presumably to double the number of miles they could get out of their available pool of safety drivers. A second passenger still may not have been able to prevent the collision, but a co-pilot keeping the safety driver honest and having the responsibility of monitoring the AI’s decision-making reports in real time certainly wouldn’t have hurt.

California requires companies to report within 10 days when an autonomous vehicle is involved in a traffic incident, as well as report annually on any circumstance in which the safety driver intervenes in the operation of the vehicle while in autonomous mode (section 227.50, pdf). Since everyone but Uber is operating under California’s rules, and there have been no news reports of safety drivers for Waymo, Google, Apple, or GM preventing those vehicles from careening into pedestrians, I think it’s safe to conclude that the safety driver is not some sort of thin blue line keeping autonomous vehicles from killing people on the regular.

3 Likes

I’ve now watched the video a few times. Between the poor quality (where is the top half of the view shown in the videos on YouTube, including the street lights?) and the poor lighting, viewers cannot tell how fast Ms. Herzberg moved or whether she looked in any direction. What you see is the back of her head after impact.

I was hit by a car when I was in third grade. It was after dark in the winter. His headlights were off. There was a 35 mile-per-hour speed limit. I looked before crossing from his left to the right, but when I was in the right lane, I saw a gleam: the street light playing on his hood. I didn’t look a second time but instead did my best to finish crossing. He clipped my hip with his passenger fender, throwing me into a snow pile. I have no idea of his speed or the distance when I saw the gleam of the street light on his hood, maybe three street lights away?

I do know that different people react differently in a situation. Maybe she already looked, maybe she didn’t. Maybe she registered the car with her peripheral vision and just tried to finish crossing because what other option did she have? But from the video, I cannot determine that she never looked and I don’t think anyone else can.

3 Likes

Proceeds to write a good half-dozen posts blaming the victim and her circumstance, and minimizing the responsibility of the people operating the massive vehicle. Thoughts and prayers, all around.

Here’s the thing: a city is full of a wide variety of people, in various circumstances, all the time. Some of those people are on foot, some of them are on bikes, some of them are even in strollers. If that fact isn’t acknowledged at a very fundamental level of your heuristics, then what’s the point of swapping out the person driving the car?

Here’s a reality check for autonomous cars: for the foreseeable future they will probably be used for freeway traffic only, and will require a hand-off to a human driver when operating on city streets. Liability is going to remain in the hands of the vehicle operator, which means insurance companies are going to determine who gets to use autonomous cars and who doesn’t.

2 Likes

I suppose so, except that if this safety driver had been doing her job, autonomous cars would still have a perfect zero-fatality record, @beanolini wouldn’t have seen a need to run those numbers, and in fact this whole thread would not even exist.

BTW, should we slow-clap Uber for taking a chance on hiring felons?

You seem remarkably determined to make this the fault of someone who was once convicted of a crime, as if that conviction should permanently bar a person from ever participating in society again. Why is that?

4 Likes

Not at all. I would not hire a convicted embezzler to work in a bank, but I would have no problem making that person a safety driver. I would not hire someone with a vehicular manslaughter rap as a safety driver, but might do so to work counting money.

I’m fine with Uber hiring whomever they want, as long as that person is able to do the job. And part of the job of “safety driver” includes “watching for bicyclists”.

1 Like

The safety driver had been convicted on a charge of attempted armed robbery.

The fact that she did a bad job is not related to her felony conviction. Plenty of non-felons are terrible at their jobs too, and it’s been rather scientifically proven that humans are inherently incapable of sustaining peak attentiveness for hours on end (that, again, is why autonomous cars are supposed to be better). So why did you bring up the fact that the safety driver is a felon like it was some kind of causal link?

3 Likes

Look, one of Mr. Bechizza’s posts references “victim blaming”, so I accepted that particular frame. But my particular view on our autonomy as individuals is that we ALL share some level of responsibility. If we feel moved to petition Congress (even though I live thousands of miles from these experiments, fuck that noise), then we are all collectively guilty of allowing this one particular woman to die, along with the two individuals who die every second on this planet.

Anyway, she was close to the scene; she was a grown-ass woman who made particular choices that evening. Her choices contributed to her death. The more research I do, the more heartbreaking I find her particular plight. I also think Uber is a shitshow of a corporation. On a jury, however, I would feel unable to convict on criminal charges without seeing her toxicology report.

As for the little “thoughts and prayers” jibe: my liberal thought aligns more with the original Bolsheviks, and Kalanick should be dealt with like a Romanov. Fuck you for casting personal aspersions in a somewhat rational debate.

1 Like

An actual human being is dead. The car is not the victim here.

3 Likes

And hundreds of people, from the autonomous vehicle engineers to the urban developers of this town, share some level of fault here. So when the Uber employee responsible for that car is fired and ends up homeless, is he or she the victim then, or just the culprit? Or maybe that Uber employee deserves prison, in your view? You, I, and everyone else die, sometimes in horrible situations. If she had looked right, she probably would have lived.

Dude. She’s dead. Let it go.