Now now, the Second Law of Robotics says that a robot has to obey humans; it doesn’t say anything about signage.
I was reading about their autonomous car program, and some of Musk’s inputs on the development of the technology, and one thing they mentioned is that Tesla dropped the idea of using various sensors because Musk declared that humans can drive by vision alone, so the cars should be able to as well. That’s when I realized that, without question, Elon Musk is a complete and total fucking idiot. I mean, sure, human vision is sufficient to drive a car - assuming those eyes are attached to a human fucking brain. And even then, given the choice between two drivers, one with normal vision and one whose vision had been reduced to the equivalent of a single crappy black-and-white camera, I know who I’d choose…
That’s the guy being given carte blanche to risk people’s lives. That complete fucking moron. I can’t believe what that total arsehole manages to get away with on a regular basis while everyone pretends that his obvious idiocy is smart, somehow.
The same road laws apply to bicycles as apply to other vehicles, at least in every state I’ve lived in over the years.
And people on bicycles tend to not kill people if they run a stop sign. Priorities…
yeah. don’t get me wrong. it’s unjust. i can also imagine why the law that exists might not match up with what’s needed.
though, that said, government regulators are still letting the vehicles on the road in the first place. i wonder how many other countries allow it, or if we’re one of the few.
[edit: eep. that list you shared actually seems to show deaths from many countries including canada, china, holland, germany, and many more. a worldwide effort.]
This is going to keep happening as we attempt to automate actual human behaviour instead of the letter of the law.
A large number of humans make rolling stops, just as a large number of humans exceed the speed limit by 5-10 mph or fail to signal in empty parking lots or when no one else is around, etc.
What’s interesting here is that I was taught in driver’s ed that it is more dangerous to drive “to rule” and be slower than the traffic around you than to drive at the same speed as everyone else.
So the question has to be asked: are we safer if automation drives to the letter of the law, or when it most closely approximates what the humans around you are actually doing?
i guess you haven’t lived in every state yet then
The new law’s nickname refers to a similar policy Idaho adopted in the ‘80s — one that states like Arkansas and Delaware have since taken cues from. The bill that created the change in Oregon passed in the waning days of this year’s legislative session.
And don’t we wonder why they are putting Elon Musk’s profits ahead of public safety? Sure, it has nothing to do with decades of regulation slashing and attacks on our public regulatory bodies!
Good question… probably one of the few…
This is one of those odd cultural things that will present an obstacle to the uptake of autonomous vehicles.
Drivers are so used to breaking the law all the damn time that an autonomous vehicle that actually obeys it (comes to a complete stop, keeps to the speed limit, gives way when required, and doesn’t jump red lights) is going to be perceived as “too slow”, despite being (on average) just as fast and even safer (if they get the technology to work).
As you note, yes, a human can, because we have this lump of tissue that consumes roughly 20% of our caloric intake dedicated to processing that sensory data. Beyond that, we use a lot more than just vision for driving! Finally, just because a human can do it with our limited senses doesn’t mean it’s the best option. That’s the whole reason for using tools in the first place! Let’s see these things with RADAR, FLIR, LADAR, PESA, and whatever other sensor packages you can cram into them. That car should be able to detect when the vehicle next to me starts drifting 3 cm in my direction and react appropriately!
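To make that concrete, here’s a toy sketch of what fusing even two lateral-distance sensors could buy you. The sensor names, noise figures, frame rate, and thresholds below are all invented for illustration, not anything from an actual driving stack:

```python
# Toy illustration: fuse lateral-offset readings for the car in the next
# lane from two hypothetical sensors, then flag when the fused estimate
# shows it drifting toward us. All numbers here are made up.

def fuse(radar_m: float, camera_m: float,
         radar_var: float = 0.02, camera_var: float = 0.08) -> float:
    """Inverse-variance weighted average of two noisy lateral offsets (meters)."""
    w_r, w_c = 1.0 / radar_var, 1.0 / camera_var
    return (w_r * radar_m + w_c * camera_m) / (w_r + w_c)

def drift_rate(offsets_m: list, dt_s: float) -> float:
    """Mean lateral velocity over the window; negative = drifting toward us."""
    return (offsets_m[-1] - offsets_m[0]) / (dt_s * (len(offsets_m) - 1))

# Neighbour starts 1.5 m away and creeps ~3 cm closer each 100 ms frame.
radar = [1.50, 1.47, 1.44, 1.41, 1.38]
camera = [1.52, 1.45, 1.46, 1.40, 1.37]
fused = [fuse(r, c) for r, c in zip(radar, camera)]

if drift_rate(fused, dt_s=0.1) < -0.2:  # more than 20 cm/s toward us
    print("neighbour drifting into our lane - evade / alert")
```

Even this crude inverse-variance averaging beats either sensor alone, which is the whole argument for more sensor packages, not fewer.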
This is starting to change (in some locales):
(1) A person operating a bicycle who is approaching an intersection where traffic is controlled by a stop sign may, without violating ORS 811.265, do any of the following without stopping if the person slows the bicycle to a safe speed:
(a) Proceed through the intersection.
(b) Make a right or left turn into a two-way street.
(c) Make a right or left turn into a one-way street in the direction of traffic upon the one-way street.
https://portlandbicyclingclub.com/newsletter/feb-2020-new-stop-sign-law/
EDIT: @gatto beat me to it.
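Since the statute reads almost like pseudocode already, here’s a toy encoding of it. To be clear, the 10 mph “safe speed” is my own stand-in; the law deliberately leaves that term undefined:

```python
# Toy encoding of ORS 814.414 as quoted above. Purely illustrative: the
# statute never defines "safe speed" numerically, so the threshold here
# is my own stand-in.

SAFE_SPEED_MPH = 10  # assumed value, not from the statute

def may_roll_stop_sign(speed_mph: float, action: str,
                       against_one_way_traffic: bool = False) -> bool:
    """May a cyclist proceed through a stop sign without stopping?"""
    if speed_mph > SAFE_SPEED_MPH:
        return False  # must first slow to a safe speed
    if action == "through":
        return True   # (a) proceed through the intersection
    if action == "turn":
        # (b)/(c): turns are allowed, but only with the direction of
        # traffic when turning onto a one-way street
        return not against_one_way_traffic
    return False

print(may_roll_stop_sign(8, "through"))                             # True
print(may_roll_stop_sign(8, "turn", against_one_way_traffic=True))  # False
print(may_roll_stop_sign(18, "through"))                            # False
```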
New driving mode proposal: Paranoid. The car refuses to get out of its parking spot. Or to unlock its doors.
So the question has to be asked: are we safer if automation drives to the letter of the law, or when it most closely approximates what the humans around you are actually doing?
The question to solve is “what is the tipping point of automated drivers vs. humans”, and when we can start holding humans to the letter of the law because it is now possible to follow it. I was once told by a local cop that no one can drive more than a mile without breaking SOME part of the vehicle code – I am sure that is not a huge stretch.
So the question has to be asked: are we safer if automation drives to the letter of the law, or when it most closely approximates what the humans around you are actually doing?
This is a false choice, though: although there are most certainly rule violations that enhance safety, it’s not a given that every accepted or tolerated law-violating driving behavior (notwithstanding its ubiquity) is safer than the law-abiding alternative.
If we are going to permit autonomous driving models that can violate the letter of the law, then they should do so in such a way that produces statistically safer driving. This requires, at the least, some sort of situational analysis to inform the design of the model as to what rules to violate, and how.
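As a minimal sketch of what such a gate might look like, assuming per-maneuver incident statistics existed to feed it (every number below is a stand-in, not real data):

```python
# Sketch of the "violate only when it's statistically safer" gate described
# above. The incident rates would have to come from real fleet/crash data;
# these are placeholders to show the shape of the decision.

from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    legal: bool
    incidents_per_million: float  # assumed: estimated from fleet/crash data

def choose(compliant: Maneuver, violating: Maneuver,
           margin: float = 0.8) -> Maneuver:
    """Permit the violation only if it is clearly, not marginally, safer."""
    if violating.incidents_per_million < compliant.incidents_per_million * margin:
        return violating
    return compliant  # default to the letter of the law

# e.g. merging: rigidly holding the limit vs. briefly matching traffic flow
hold_limit = Maneuver("hold speed limit on merge", True, 4.1)
match_flow = Maneuver("exceed limit to match flow", False, 2.3)
print(choose(hold_limit, match_flow).name)  # -> exceed limit to match flow
```

The safety margin is there so the model doesn’t trade legality away for a statistically insignificant gain.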
Where I live the word “STOP” on a sign is taken to be a friendly suggestion.
Tesla is the #1 car brand in Norway. Also in Norway: absolute right of way for pedestrians. When I first moved there I was freaked out that people would stroll off the sidewalk into the street without looking both ways, confident that they wouldn’t get hit. (After a while I was doing this too.) I hope this autonomous software has not been programmed to be assertive over pedestrians.
If we are going to permit autonomous driving models that can violate the letter of the law, then they should do so in such a way that produces statistically safer driving. This requires, at the least, some sort of situational analysis to inform the design of the model as to what rules to violate, and how.
This is precisely what I am advocating for, along with the removal of laws or regulations that no one follows as an added bonus. Society gains nothing when everyone is a “criminal” because the laws are too strict to ever be followed - it just lets bad actors enforce those laws selectively as cover for bigotry or other bad behaviour, and makes no one safer.
And people on bicycles tend not to kill people if they run a stop sign.
Well, not other people…