Waymo self-driving vans having "trouble turning left"

That map of jobs by state was very interesting. I think I found the source:

https://www.npr.org/sections/money/2015/02/05/382664837/map-the-most-common-job-in-every-state

Looks like the truck driver preponderance is more about how the census slices other job categories more finely. But who knew there were states where secretary was still a thing. And that Utah is a tech hub.


Yes, the category includes not only long-haul drivers but also things like local delivery drivers (who make a lot less money than the long-haul guys, who’ll be the first to be automated out).


I have seen a Waymo self-driving car pause for about 30 seconds at a four-way stop while other drivers got visibly annoyed.

I wonder if their safety weightings factor in the road rage engendered by overly conservative driving.
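
To make that concrete, here’s a toy cost function in Python. The weights and terms are entirely made up and have nothing to do with Waymo’s actual planner; it just shows how a heavily weighted collision-risk term, with no “annoying every human behind me” term at all, makes a 30-second pause look like the winning plan:

```python
# Toy planner cost. All weights and inputs are made up for illustration and have
# nothing to do with Waymo's real planner. Lower cost means preferred plan.

W_COLLISION = 1000.0  # penalty per unit of estimated collision risk
W_DELAY = 1.0         # penalty per second spent waiting
W_RAGE = 0.0          # penalty for enraging/confusing other drivers; unmodelled here

def plan_cost(collision_risk, delay_s, rage_score):
    """All inputs are made-up scalar estimates; lower is better."""
    return W_COLLISION * collision_risk + W_DELAY * delay_s + W_RAGE * rage_score

# Rolling through the four-way stop: small perceived risk, no delay, no annoyance.
go_now = plan_cost(collision_risk=0.05, delay_s=0.0, rage_score=0.0)    # 50.0
# Pausing 30 seconds: zero risk, but everyone behind starts fuming.
wait_30s = plan_cost(collision_risk=0.0, delay_s=30.0, rage_score=8.0)  # 30.0

# With W_RAGE at zero, waiting "wins" (30 < 50); give road rage any real weight
# (say W_RAGE = 5.0) and the comparison flips.
print(go_now, wait_30s)
```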

They pretty much have to be ultra-conservative because every time one is involved in a minor accident, you have newscasters leading with the story of the death robots plaguing our streets.


And I suppose if the robots get attacked by torch-carrying humans with road rage, the robots get sympathy points. :)

I think a large part of the problem here, and also of the four-way stop problem @bolamig mentioned, is that human drivers rely on watching the other driver far more than self-driving car proponents have acknowledged in the past. Whether it’s explicit eye contact and a nod, or just an implicit glance to see if the other driver is paying attention, we are always forming mental models of the other drivers.

Other kinds of communication exist as well: the explicit flashing of lights to signal that one driver is letting the other one in, and the implicit slowing down that does the same thing.

Obviously, this will be solved when >70%, or whatever, of the cars are autonomous and can talk to each other with their soulless digital voices, but until then the car is going to be deaf to all these forms of communication, and so it has to act only on near-certainties, like a huge gap in traffic or an empty intersection.
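
Until then, the decision rule ends up looking something like this toy sketch (hypothetical thresholds, nobody’s real logic): the human can fold a wave or a nod into the decision, while the robot can only act on the measured gap, so it demands a much bigger one.

```python
# Toy gap-acceptance rule with hypothetical thresholds, not any vendor's real logic.
# The human folds soft signals (a wave, eye contact) into the decision; the car is
# deaf to them, so only the measured gap counts and it demands a much bigger margin.

HUMAN_ACCEPTABLE_GAP_S = 4.0  # rough gap a human might take after a nod or a wave
ROBOT_ACCEPTABLE_GAP_S = 8.0  # near-certainty margin when you can't read intent

def human_goes(gap_s, other_driver_waved):
    return gap_s >= HUMAN_ACCEPTABLE_GAP_S or other_driver_waved

def robot_goes(gap_s):
    return gap_s >= ROBOT_ACCEPTABLE_GAP_S

for gap in (3.0, 5.0, 9.0):
    print(gap, human_goes(gap, other_driver_waved=True), robot_goes(gap))
# 3.0 s: the waved-in human goes, the robot waits
# 5.0 s: the human goes, the robot still waits
# 9.0 s: both go
```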


What you are saying is that these cars need a system like this:


I think the concern is that when a person has an accident, it’s because of that one person; if an autonomous car causes an accident, all autonomous cars from that manufacturer are potential accidents. I mean, remember the “unintended acceleration” kerfuffle that Toyota went through? Think of that times a thousand.

Except that autonomous vehicles need to be aware of cyclists, pedestrians, and all sorts of other hazards they can’t talk to.

Backing into a loading dock is actually easier for a computer to do than dealing with unpredictable traffic. It is hard for humans because maneuvering a trailer in reverse screws with our perceptions.

(That said, I haven’t really backed a trailer into a loading dock; I’ve backed an RV into a site, and I’ve backed a pickup truck into a loading dock that wasn’t very busy, so maybe a loading dock in full swing has enough unpredictability going on that it could be an issue.)
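
For what it’s worth, the reason reversing a trailer screws with us is that the hitch-angle dynamics are unstable in reverse: a small error grows instead of dying out, which is exactly the kind of thing a feedback controller handles without breaking a sweat. A rough sketch with a simplified kinematic model (made-up dimensions, and it ignores the controller entirely; it just shows the instability):

```python
import math

# Simplified kinematic trailer model (on-axle hitch) with made-up dimensions.
# phi is the hitch angle (tractor heading minus trailer heading). With the steering
# held straight, phi roughly obeys d(phi)/dt = -(v / D) * sin(phi): a small error
# dies out going forward (v > 0) but grows going backward (v < 0), which is the
# jackknife tendency that makes reversing feel unnatural to us and is exactly what
# a feedback controller can stabilise by steering against the error.

D = 6.0    # hitch-to-trailer-axle distance in metres (made up)
DT = 0.05  # integration step in seconds

def hitch_angle_after(v, phi0=0.1, seconds=5.0):
    phi = phi0
    for _ in range(int(seconds / DT)):
        phi += -(v / D) * math.sin(phi) * DT
    return phi

print("forward:", round(hitch_angle_after(v=+2.0), 3))  # error shrinks toward 0
print("reverse:", round(hitch_angle_after(v=-2.0), 3))  # error keeps growing
```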

IMHO we’re setting the bar too high for autonomous vehicles. They don’t need to be perfect, just better than humans on average for now. That’s a fairly low bar, and unlike with humans we can raise the bar over time by making the autonomous systems smarter and smarter. Eventually we could have automobile travel that is nearly as safe as flying.

Honestly, I’m a bit skeptical that AVs have even collectively cleared the bar of “not worse than humans”, and the scenarios they have been doing well in are hardly representative of the breadth of driving conditions such vehicles would face if they were deployed nationwide. I’ve yet to see anyone outside of a university research lab talking about solving the challenge of driving in weather conditions other than those that occur in Arizona and Southern California. (MIT is working on a solution that uses ground-penetrating radar to keep cars in their lane when the markings are either missing or covered in snow, but even that requires mapping literally every lane of every road in the country first to be 100% effective.)
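
There’s also a numbers problem with even demonstrating “not worse than humans”. Using the rough US baseline of about 1.2 fatalities per 100 million vehicle-miles (the exact figure doesn’t change the conclusion much), a quick Poisson back-of-envelope:

```python
import math

# Poisson back-of-envelope: how many fatality-free miles would an AV fleet need to
# log before you could claim, at roughly 95% confidence, that its fatality rate is
# below the human baseline? The baseline (about 1.2 per 100 million miles in the US)
# is approximate; the conclusion survives any figure in that ballpark.

human_rate = 1.2 / 100_000_000  # fatalities per mile, approximate

# If the fleet's true rate equalled the human rate, the chance of seeing zero
# fatalities in m miles is exp(-human_rate * m). Requiring that chance to fall
# below 5% gives the mileage needed to rule out "no better than humans".
confidence = 0.95
miles_needed = -math.log(1 - confidence) / human_rate

print(f"{miles_needed / 1e6:.0f} million fatality-free miles")  # roughly 250 million
```

So you can’t just eyeball a few million test miles and declare the bar cleared.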

Computers are already better than us at the things we’re bad at (paying attention, making snap decisions), but they have yet to get as good as us at the things we’re good at (object recognition, understanding human behavior, non-verbal communication). On top of this left-turn problem, just this year there have been reports that GM’s Cruise program has problems with excessive braking because it throws false positives all over the place, even treating road signs as potential obstructions. Uber notably killed someone a few months ago when their software failed in the opposite direction, rejecting the hardware’s detection of a pedestrian walking their bike across the street. Tesla’s Autopilot lane-assisted someone straight into an interstate exit ramp’s crash barricade at speed. (Tesla, at Musk’s insistence, is also rejecting anything but visual-spectrum cameras in the development of its full AV platform, which is problematic given that those cameras have previously failed to notice things like a semi truck turning across the lane of travel.)

This stuff simply is not ready for prime time, and probably won’t be for quite a while. Certainly not in the 5 years being promised by the luminaries in Silicon Valley. We might have fully qualified AV packages that work on the interstate by then, but certainly not fleets of them crawling around city streets.
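
Those two failure modes above (Cruise’s phantom braking and Uber’s rejected detection) are really the same dial turned in opposite directions. A toy illustration of the threshold trade-off, with made-up confidence scores that have nothing to do with anyone’s real perception stack:

```python
# Toy threshold trade-off with made-up confidence scores, not anyone's real
# perception stack. Lower the "brake if confidence >= X" threshold and you get
# phantom braking for road signs (false positives); raise it to smooth out the
# ride and you start discarding real pedestrians (false negatives).

detections = [
    ("pedestrian with bike", 0.55),  # real hazard, but an awkward low-confidence one
    ("road sign", 0.50),             # not a hazard, but reflective and confusing
    ("plastic bag", 0.30),           # not a hazard
    ("car stopped ahead", 0.90),     # real hazard, easy case
]

def brake_for(threshold):
    return [name for name, confidence in detections if confidence >= threshold]

print("cautious  (0.25):", brake_for(0.25))  # brakes for everything, even the bag
print("middling  (0.45):", brake_for(0.45))  # keeps the pedestrian, still brakes for the sign
print("assertive (0.60):", brake_for(0.60))  # smooth ride, and it just ignored the pedestrian
```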

Of course, I’m also hesitant to embrace a driverless car future that simply swaps human drivers with computers, because it does nothing to address the fundamental underlying problem that our cities are built around cars to the exclusion of pretty much every other form of transit. While we’re probably never going to completely eliminate personal point-to-point vehicular transportation, I don’t see any reason to further entrench it.

I also worry about how well AVs will play with more aggressive forms of traffic-calming, like greenways and other mixed-use roads where the whole point is to eliminate separations between vehicles and pedestrians, encouraging drivers to pay more attention and slow the eff down. It’s hard enough to ask an AI (or, more accurately, a black box of probabilistic algorithms that nobody really understands the inner workings of) to predict pedestrian behavior when they’re “safely contained” on the sidewalk and crosswalks. If AVs can’t be relied on to navigate these spaces safely and effectively, what will the response be? Given the history of car supremacy in this country’s transportation planning, I worry that broadly-deployed not-good-enough AV technology will only undo the limited progress being made to reclaim streets for other forms of transportation, as planners seek to “protect” pedestrians and cyclists.


Re: low bar

If it were, then why don’t L5 AVs exist yet, what with “strong AI” being “just around the corner” for the last 50 years or so?

This topic was automatically closed after 5 days. New replies are no longer allowed.