I heard he shot Larry Page later the same day.
The Google employee was in the car to ensure the safe operation of the vehicle, regardless of whether it had a steering wheel. So I would assume they are directly responsible for it, though ultimately it's Google's car, so the case can also be made that the company itself can be ticketed.
I'm definitely interested in seeing how legislation will handle these kinds of issues revolving around autonomous cars.
If you look at how Tesla handles their self-driving software, they say that the driver must absolutely keep watch on the road and must be able to assume control of the car. So in this instance it does align with my previous thinking. But in cars where the passenger really has no control over the vehicle, or cannot be expected to provide assistance, I'm not so sure about liability.
This also brings up the classic dilemma: if an accident occurs and the options are a) the car saves you but a bus full of children dies, or b) the car kills you and spares the bus, should the car make this decision? Who becomes responsible for either outcome?
There's a lot of complexity behind autonomous cars beyond the software, for sure.
Once in a while the police around here do issue tickets to people going 30 on 45 mph roads. I wish they did it more frequently.
Autonomous anything should not be subject to human laws. Human laws directly apply only to other humans. People wanting to bother other people about what autonomous machines do is a desperate waste of time.
Typically you don't expect it to go:
no violation -> traffic stop -> everything's OK!
It's supposed to be:
no violation -> no traffic stop
But, you know, America, I guess.
Ultimately these machines are built around human interaction on a large scale, so legislation is definitely important. If we're talking about industrial automation, that's a different thing, but when you talk about cars I would expect there to be some clear rules. Also, these machines would be services designed to make some people money, so part of the needed legislation would also have to protect consumers.
Tesla's "autopilot" is more of a glorified cruise control system. It specifically does not replace the driver, but instead automates part of the job. The driver is still in charge.
Google's cars DO replace the driver. A taxi passenger would have no control except for an emergency stop button, and need not be paying attention at all.
And no ticket was issued -- not because there was no driver to whom to issue it but because the car had committed no violation.
Sure, that's why no ticket was issued. Not because the cop knows how to issue a ticket to a human, violation or not, but doesn't know how to issue one to a robot.
I mean yes, that would be the ostensible reason - saying otherwise would be confessing to professional misconduct.
Yes, and this is one reason I'm unhappy with Tesla's approach (and I hear the Google Self-Driving folk aren't too happy about the precedent Tesla's setting either).
Musk is dead-set on quickly pushing out a cool feature that they can brag about, which has been a big cause of the internal turmoil on Tesla's autopilot team over the last six months. Their approach is also very different from Google's project: their sensor system is less robust, and they want to run it at higher speeds but in comparatively simpler situations. As a result they have to compromise and require the driver to be aware and ready to react... but I think we agree that this isn't a good strategy.
I want to know what the UI for the passenger looks like. How did they pull the car over? Did it respond to the siren, or is there a menu for unexpected stops? How does it handle pulling to the side when a fire truck comes by?
I'm sure others have brought these issues up; I'm just wondering how they handled them.
So-called "consumers" could also be taught to wake up and simply not buy stuff that exploits them. There are reasons why this isn't done, and why we instead default to putting somebody else in control. These can be seen as two competing forms of exploitation, both of which disregard the individual.
Without knowing more I can't say why the officer pulled it over, but there are plausible reasons. Driving 25 in a 35 mph zone is uncommon and usually means nothing - someone older, or searching for a house or sign - but it can also be a sign that the driver is intoxicated. Extremely cautious driving (which is what Google's cars would look like compared to typical human behavior) can (but shouldn't, oh well) seem suspicious to a cop, as if the driver is trying too hard to look innocuous.
Quick, blow more air into the autopilot!
As others have mentioned, too slow can be as bad as too fast. This Jalopnik article (and its comments) fleshes out more of the details.
It's also possible that the cop noticed the car (it does look odd) and may have noticed there was no driver. So they used the speed as an excuse to pull the car over and figure out whether someone was driving recklessly or it was something else entirely. Imagine the cop's surprise when it turned out to be the latter.
They can say what they like, but humans are not actually capable of that. Not sure what the courts will make of it.
Send the car to robot court
Thing is, this was in Mountain View near Rengstorff & El Camino, very close to Google's headquarters and less than 2 miles from the Self-Driving Car project's central garage. Google has done a good job providing public information about these cars (including multiple town hall meetings), and I'd expect that, considering the amount of self-driving they do in the area, they'd be working closely with the local PD the officer belonged to...
So what was he concerned about and what was he hoping to accomplish?
Edit: here's the Self-Driving Project's post about it, and Mountain View Police Department's, which has all the info:
This afternoon a Mountain View Police Department traffic officer noticed traffic backing up behind a slow moving car traveling in the eastbound #3 lane on El Camino Real, near Rengstorff Ave. The car was traveling at 24 mph in a 35 mph zone. As the officer approached the slow moving car he realized it was a Google Autonomous Vehicle. The officer stopped the car and made contact with the operators to learn more about how the car was choosing speeds along certain roadways and to educate the operators about impeding traffic per 22400(a) of the California Vehicle Code. The Google self-driving cars operate under the Neighborhood Electric Vehicle Definition per 385.5 of the California Vehicle Code and can only be operated on roadways with speed limits at or under 35 mph. In this case, it was lawful for the car to be traveling on the street as El Camino Real is rated at 35 mph.
The Mountain View Police Department meets regularly with Google to ensure that their vehicles operate safely in our community.