Au contraire. Any software in a modern car goes through an industry-standard safety process. ISO 26262 defines classifications (ASIL A through D) that apply to components in your car, with some aspects of self-driving falling under ASIL D, the highest level. On top of this, SAE defines levels of driving automation. If you have a lower automation level, like “hands on” (Level 1), then much of your so-called self-driving system falls under a much lower ASIL. And you get issues like people letting their Tesla drive for them when it’s not really capable of hands-off/eyes-off/mind-off (Level 4) driving.
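To make the level ladder above concrete, here’s a rough sketch of the SAE levels as a lookup table. The summaries are paraphrased for illustration, not quoted from SAE J3016, and the hands/eyes/mind shorthand is informal:

```python
# Illustrative paraphrase of the SAE J3016 automation levels (not official text).
SAE_LEVELS = {
    0: "No automation: human does everything",
    1: "Driver assistance ('hands on'): one assist feature, human drives",
    2: "Partial automation ('hands off'): car steers and accelerates, human supervises",
    3: "Conditional automation ('eyes off'): human must take over when requested",
    4: "High automation ('mind off'): no human attention needed within a defined domain",
    5: "Full automation: no human driver needed anywhere",
}

def driver_must_supervise(level: int) -> bool:
    """Rough rule of thumb: below Level 3 the human must monitor at all times."""
    return level < 3

# A Level 1 or 2 car still needs an attentive driver, whatever the marketing says.
assert driver_must_supervise(1)
assert not driver_must_supervise(4)
```

The point of the table is the gap the post describes: a Level 1/2 system legally still has a human driver in the loop, even when owners treat it like Level 4.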
Interesting. Recall also the collision happened at 10 pm.
In broad daylight a cautious driver or self-driving car might spot the pedestrian and give extra space, but it might be hard for either to recognize the situation well after dark.
For now I agree that testing these on public roads while they are still improving them seems like they are pushing their (our) luck.
But it is only a matter of time before self-driving cars’ safety exceeds that of human drivers. At that point, human drivers on public roads will be the ones who need to go find a private track, like horses, to play.
But like I said before, “self-driving car” does not mean commercial vehicles only. Right now it does, because they are too expensive, but this will follow the same course as electric cars. Remember the original ones that only CEOs and movie stars could afford? Over time they will get better and more full-featured while the price drops until they are affordable for regular drivers.
Proof please
We can both play that game.
Proof please that it’s impossible, ever in the future of humanity for a self driving car to be safer than a human?
Neither of us have a crystal ball.
Like I said, I don’t think they are safe enough today, and tests on real roads seem like an unnecessary risk before they prove themselves on some sort of private test-city roads. The bar should be high. Higher than what a 16-year-old has to clear to pass. And there should be standard federal guidelines for testing, not just company X saying a car passed their internal tests.
If they are never safer than humans then they should never be on the road. I have no problem with that.
If they do in time exceed human safety levels, do you still have a problem with them? If so, then I think your complaint about safety isn’t genuine and is really more along the lines of gun hobbyists who think their love of their favorite pastime is more important than the safety of those around them.
Edited to add: There is nothing wrong with being passionate about manually driving a car. I used to live out of the city and often had the opportunity to drive stupidly fast on open, empty roads. It’s fun and thrilling (and dangerous). Now I live in Seattle and commute every day into the downtown area, and I hate driving. There is no joy in moving forward 3 feet every 5 minutes for an hour or more. I’ve taken the bus everywhere for the past year and wonder why I pay insurance on a car I haven’t driven in months.
What you type is not unreasonable, and even admirable. However, it’s not going to be the big picture ideas, practices and goals that matter in the end. It’ll be whether or not some jury can be persuaded that some “individual contributor” is on some level at fault. If and when that’s accomplished, then the precedent is set.
In the real world, this is how it’ll likely work out:
- autonomous car hits a pedestrian, biker, or some other car, and someone gets killed because of it
- person behind the wheel (but not driving): “it wasn’t my fault, I wasn’t warned by the computer!”
- cops: “duh…we dunno what to do. Here’s a ticket for obstructing traffic.”
- car manufacturer: “don’t look at us!”
- software company: “probably was a hardware malfunction. Definitely not our algorithm.”
- tech lab: “well, uh, the car’s computer was unfortunately incinerated in the fiery mess, so, uh, we’re not sure what caused this.”
- family of the deceased (X-number of years later), persuaded by a crack legal team: “this is insane! Clearly this was a software problem. We’re suing!”
- lawyers representing software company: “our clients said it wasn’t their fault. We’re suing everyone else involved to get this figured out once and for all!”
And that’s where the deep, deep, deep dive begins into “who, really, is at fault!?” To be continued, for sure.
Humans will be genetically engineered by then, obviating any purported need for driverless cars.
Cool, so nothing for you to be afraid of then.
Hey - neither may come to pass in my lifetime.
Also - I live right in the middle of center city. No open roads. My transport usage includes cars & public transport in equal measure.
True for me as well. And just to be clear. I’m not pro- or anti- self driving cars.
I just know that commuting sucks and would like better options that are safe and affordable. What form that takes doesn’t matter to me. I do know that the best solution varies based on location and person. As a tech person, public transport is fine for me, even though I have quite a walk from my bus stop to my work. Other professions need to transport equipment that a bus doesn’t allow. I’m about using the best tool for the job, whatever that might mean. In the kitchen I’d rather use an electric blender than a hand-cranked mixer. But then something new like the Juicero comes along, and it’s worse than a basic old-fashioned juicer. Which bucket the self-driving car belongs to, only time will tell.
I forgot to mention shoe - shoe is a marvel of transportation in the city.
Agreed. I walk a couple of miles every day on both ends of my commute (sun, rain, or snow). I was just debating whether it’s time to take the heavy shoe spikes out of my bag, and someone said it might snow tonight. Crazy weather.
After flying to Phoenix for the first time, I followed the signs in the airport to a shuttle pick-up point. Travelers had to cross a road to wait for it. It was late and dark, so I stood at the crosswalk and judged by the oncoming headlights of a car that it would be safe to cross after it passed by. Only it didn’t keep moving; it started slowing down. By the time it reached the crosswalk, more cars were coming in other lanes, and they did the same thing. I was horrified because they all stopped, waiting for me to cross the road.
I’d probably watched Death Race 2000 too many times as a kid, because crossing that road was hard to do with my instincts telling me to run in the opposite direction.
Real life can be so freakin’ weird.
You can look at the Toyota unintended acceleration case if you’re looking for the engineering details and what happened in real court rooms instead of your hypothetical assumptions. https://users.ece.cmu.edu/~koopman/pubs/koopman14_toyota_ua_slides.pdf
By “engineering details” do you mean the text on slide 46:
• A 2007 internal Toyota e-mail says:
• “In truth technology such as failsafe is not part of the Toyota’s engineering division’s DNA.”
But, yeah, part of the reason corporations exist is to protect individuals from liability.
Interesting stuff, thanks.
. . . and the drunk!
What? Why are you looking at me like that? A couple of drinks; that’s all I had! Point zero eight percent isn’t shit; you know that.
Ah, go to hell, the lot of you! I drive good drunk!
I’LL FIGHT YOU!! COME ON!!!
Yes, it’s just KathyPadilla, claiming over and over again that when companies put autonomous vehicles out on public roads, it amounts to privatization. Several people have observed that this is nonsense, but she is a deflector of Mike Pence’s stature, and she just pretends they said something different and keeps chugging along. I suspect she’s a bot, which is both deeply ironic and flat-out hilarious.
There are two lessons here.
- Don’t make claims in company emails that lawyers will find during discovery.
- Follow industry safety standards instead of just winging it as Toyota basically did.
It was a very expensive mistake for the company. And now that we know more, expectations have changed and are much stricter, and the punishment is going to be much more severe.
Companies aren’t trying to avoid liability to save a few bucks. I think the whole industry is trying to avoid going out of business on the next bad legal case. VW’s $15B settlement on just lying about environmental regulation software is a sign of how we’ll probably see lawsuits go in the future, especially if companies deviate from accepted standards or try to deceive regulators and courts.
Very quickly we’ll have self-driving cars on the road for a collective billions of hours. And just statistically speaking, there will be fatalities. Maybe even due to software faults, because even that process is not perfect. (Well, it can be perfect, but the standard is NOT perfection.)
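To put “statistically speaking there will be fatalities” in rough numbers: the U.S. human-driver fatality rate is on the order of one death per 100 million vehicle-miles. The fleet mileage below is an assumption picked purely for illustration:

```python
# Back-of-the-envelope: even a fleet safer than human drivers produces some
# fatalities at scale. All numbers here are illustrative assumptions.
human_fatality_rate = 1 / 100_000_000   # ~1 death per 100M vehicle-miles (rough US figure)
fleet_miles = 10_000_000_000            # assume autonomous fleet drives 10 billion miles

# If the fleet were exactly as safe as human drivers:
expected_at_human_rate = fleet_miles * human_fatality_rate      # -> 100 deaths

# Even if the fleet were ten times safer than humans:
expected_at_10x_safer = expected_at_human_rate / 10             # -> 10 deaths

print(expected_at_human_rate, expected_at_10x_safer)
```

So zero fatalities is never the realistic standard; the question is whether the rate beats the human baseline.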
you mean something some guy put on an internet forum, pulled from a science fiction book, may NOT BE REAL???
i am SHOCKED! Shocked I tell you! and amazed.