Cop pulls over Google self-driving car, finds no driver to ticket

I heard he shot Larry Page later the same day.

The Google employee was in the car to ensure the safe operation of the vehicle, regardless of whether it had a steering wheel. So I would assume they are directly responsible for it, though ultimately it’s Google’s car, so the case can also be made that the company itself can be ticketed.

I’m definitely interested in seeing how legislation will handle these kinds of issues revolving around autonomous cars.

1 Like

If you look at how Tesla handles their self-driving software, they say the driver must absolutely keep watch on the road and must be able to assume control of the car. So in this instance it does align with my previous thinking. But in cars where the passenger really has no control over the vehicle, or cannot be expected to provide assistance, I’m not so sure about liability.

This also brings up the classic dilemma: “If an accident is unavoidable and the choice comes down to (a) the car saves you but a bus full of children dies, or (b) the car kills you and spares the bus, should the car make this decision? Who becomes responsible for either decision?”
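Part of what makes this so thorny is that an autonomous car can’t improvise: whatever it does in that moment is the result of a policy someone wrote down in advance. A toy sketch of that point (every name and number here is hypothetical, not anything a real vendor ships):

```python
# Toy illustration: an unavoidable-collision policy has to be encoded
# ahead of time, which is exactly what makes the liability question hard.
# All maneuver names and casualty figures are invented for illustration.

def choose_maneuver(options):
    """Pick the maneuver whose predicted outcome has the lowest 'cost'.

    `options` maps a maneuver name to the predicted number of people
    harmed if it is taken. Whoever defines this cost function is, in
    effect, making the ethical decision before the accident happens.
    """
    return min(options, key=lambda maneuver: options[maneuver])


# The scenario from the post, expressed as predicted casualties:
outcome = choose_maneuver({
    "swerve_protect_occupant": 30,   # the bus full of children is harmed
    "brake_sacrifice_occupant": 1,   # only the passenger is harmed
})
print(outcome)  # a pure casualty-minimizing policy sacrifices the passenger
```

The point isn’t that anyone would ship a function this crude, but that *some* function like it must exist, and whoever wrote it is a plausible target for the responsibility question.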

There’s a lot of complexity behind autonomous cars for sure beyond software :sweat:


Once in a while, police around here do issue tickets to people going 30 on the 45 roads. I wish they did it more frequently. :wink:

1 Like

Autonomous anything should not be subject to human laws. Human laws directly apply only to other humans. People wanting to bother other people about what autonomous machines do is a desperate waste of time.

Typically you don’t expect it to go:
no violation -> traffic stop -> everything’s OK!
it’s supposed to be:
no violation -> no traffic stop

But, you know, America I guess.


Ultimately these machines are built around human interaction on a large scale, so legislation is definitely important. Industrial automation is a different matter, but when you talk about cars I would expect there to be some clear rules. These machines would also be services designed to make some people money, so part of the needed legislation would have to protect consumers.

Tesla’s “autopilot” is more of a glorified cruise control system. It specifically does not replace the driver, but instead automates part of the job. The driver is still in charge.

Google’s cars DO replace the driver. A taxi passenger would have no control except for an emergency stop button, and need not be paying attention at all.

And no ticket was issued -- not because there was no driver to whom to issue it but because the car had committed no violation.

Sure, that’s why no ticket was issued. Not because the cop knows how to issue a ticket to a human whether or not there’s been a violation, but has no idea how to issue one to a robot.

I mean yes, that would be the ostensible reason - saying otherwise would be confessing to professional misconduct.

1 Like

Yes, and this is one reason I’m unhappy with Tesla’s approach (and I hear the Google Self-Driving folk aren’t too happy about the precedent Tesla’s setting either).

Musk is dead-set on quickly pushing out a cool feature they can brag about, which is a big cause of the internal turmoil on Tesla’s autopilot team over the last six months. Their approach is also very different from Google’s project: their sensor system is less robust, and they want to run it at higher speeds but in comparatively simpler situations. As a result they have to compromise and require the driver to stay aware and ready to react… but I think we agree that this isn’t a good strategy.

1 Like

I want to know what the UI for the passenger looks like. How did they pull the car over? Did it respond to the siren, or is there a menu for unexpected stops? How does it handle pulling to the side when a fire truck comes by?

I’m sure others have brought these issues up, I’m just wondering how they handled it.
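We don’t know what Google actually built here, but the behavior being asked about can be pictured as a small state machine over a few high-level states. Everything below (the states, the `siren_detected` input, the stop button) is a hypothetical sketch, not the real system:

```python
# Hypothetical sketch of high-level pull-over logic for an autonomous taxi.
# The states and inputs are invented for illustration; Google has not
# published its system at this level of detail.

DRIVING, PULLING_OVER, STOPPED = "driving", "pulling_over", "stopped"

def next_state(state, siren_detected, passenger_stop_pressed, curb_reached):
    """Advance the vehicle's high-level state for one decision tick."""
    if state == DRIVING and (siren_detected or passenger_stop_pressed):
        return PULLING_OVER   # yield to an emergency vehicle, or honor a rider request
    if state == PULLING_OVER and curb_reached:
        return STOPPED
    return state


# A siren is detected, then the car reaches the curb:
state = DRIVING
state = next_state(state, siren_detected=True,
                   passenger_stop_pressed=False, curb_reached=False)
state = next_state(state, siren_detected=True,
                   passenger_stop_pressed=False, curb_reached=True)
print(state)  # stopped
```

On this model the traffic stop and the fire-truck case are the same transition, just with different triggers, which is presumably why the question of siren detection matters so much.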


So-called “consumers” could also be taught to wake up and simply not buy stuff that exploits them. There are reasons why this isn’t done, and why the default is instead to put somebody else in control. These can be seen as two competing forms of exploitation that both disregard the individual.


Without knowing more I can’t say why the officer pulled it over, but there are plausible reasons. Driving 25 in a 35 mph zone is uncommon and usually means nothing (someone older, or searching for a house or sign), but it can also be a sign that the driver is intoxicated. Extremely cautious driving (which is what Google’s cars would look like compared to typical human behavior) can (but shouldn’t, oh well) seem suspicious to a cop, as if the driver is trying too hard to look innocuous.


Quick, blow more air into the autopilot!


As others have mentioned, too-slow can be as bad as too-fast. This Jalopnik article (and comments) fleshes out more of the details.

1 Like

It’s also possible that the cop noticed the car (it does look odd) and may have noticed there was no driver. So they used the speed as an excuse to pull the car over and figure out whether someone was driving recklessly or if it was something else entirely. Imagine the cop’s surprise when it was something else entirely.

1 Like

They can say what they like, but humans are not actually capable of that. Not sure what the courts will make of it.

Send the car to robot court


Thing is, this was in Mountain View near Rengstorff & El Camino, very close to Google’s headquarters and less than 2 miles from the Self-Driving Car project’s central garage. Google has done a good job providing public information about these cars (including multiple town hall meetings), and given the amount of self-driving they do in the area, I expect they’d be working closely with the local PD the officer belonged to…

So what was he concerned about and what was he hoping to accomplish?

Edit: here’s the Self-Driving Project’s post about it, and Mountain View Police Department’s, which has all the info:

This afternoon a Mountain View Police Department traffic officer noticed traffic backing up behind a slow moving car traveling in the eastbound #3 lane on El Camino Real, near Rengstorff Ave. The car was traveling at 24 mph in a 35 mph zone. As the officer approached the slow moving car he realized it was a Google Autonomous Vehicle. The officer stopped the car and made contact with the operators to learn more about how the car was choosing speeds along certain roadways and to educate the operators about impeding traffic per 22400(a) of the California Vehicle Code. The Google self-driving cars operate under the Neighborhood Electric Vehicle Definition per 385.5 of the California Vehicle Code and can only be operated on roadways with speed limits at or under 35 mph. In this case, it was lawful for the car to be traveling on the street as El Camino Real is rated at 35 mph.

The Mountain View Police Department meets regularly with Google to ensure that their vehicles operate safely in our community.
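The two code sections the MVPD cites boil down to a simple rule: under CVC 385.5 a Neighborhood Electric Vehicle may only use roads posted at 35 mph or below, while CVC 22400(a) separately covers impeding traffic by driving too slowly. A minimal restatement of the NEV rule as a check (the function name and structure are mine, only the threshold comes from the quote):

```python
# Restating the quoted California Vehicle Code rule as a check.
# CVC 385.5: Neighborhood Electric Vehicles are limited to roadways
# with posted speed limits at or under 35 mph.

NEV_MAX_POSTED_LIMIT_MPH = 35

def nev_may_use_road(posted_limit_mph):
    """True if a Neighborhood Electric Vehicle may lawfully use a road
    with the given posted speed limit."""
    return posted_limit_mph <= NEV_MAX_POSTED_LIMIT_MPH


# El Camino Real near Rengstorff is rated at 35 mph, so the Google car
# (traveling at 24 mph) was lawfully on the road:
print(nev_may_use_road(35))  # True
print(nev_may_use_road(45))  # False
```

Which also explains the officer’s actual concern: the car was legal to be there, but its 24 mph speed raised the separate 22400(a) impeding-traffic question.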