Yes. Idiocy is a given. It's the default in our world. Assume people are dumber than you think and you'll always end up prepared and rarely disappointed.
Any feature of a device as complicated as a car should be expected to be abused by lazy, panicked or inattentive drivers.
The question that needs to be asked is:
“How well can we keep this moron driver from killing themselves and other motorists?”
I didn't see it in the OP, but I do wonder what kind of accidents these were and whether driver inattention was part of the problem or not. I could imagine it could be, but there's also the possibility that the algorithm just goes wrong sometimes [like suddenly swerving across lanes].
I think there needs to be a minimal level of engagement with the driving experience, below which accident rates increase drastically. For me, being in the steering "loop" is plenty to keep me focused on safe driving (38 years driving, no accidents, no tickets!); the velocity loop is usually handled by cruise control. If I didn't have to actively steer, I'd be asleep within minutes.
Taking the driver out of the active steering loop requires either a bulletproof autopilot (which we're very far from, IMHO) or a driver who remains a safe observer, ready to step in and take control immediately when required. Which, if I read the article correctly, is apparently one second before a Tesla expects impact.
I doubt I could maintain that level of focus unless I’m doing something. My experience as a passive passenger is usually sleeping. Which is what the autopilot fanbois think we’ll be able to do soon.
Can confirm. That's why, if you aren't a fool, you keep at least one hand on the wheel and stay ready to take back control at any time. Sounds awful, works fine. (Except for some of the drivers behind the statistics in the article, I suppose, but dead men tell no tales.)
I mean, it doesn’t happen often. But it does happen.
Close. It's like saying that someone who grabs some "Parachute"-branded napkins to jump out of a plane, after spending the entire flight using them as napkins and noticing how easily they shred, is a fool. I'm at peace with that characterization.
The analogy, as you worded it, only works for a brand-new driver who's never used the system before.
Nonsense. If the driver doesn't have an accident while using Autopilot to do their driving, they get positive reinforcement and continue to use it that way, at least until they get into an accident or a near miss, at which point it's often too late.
None. And yet I'm not deluded into thinking Autopilot is some kind of magic driving robot, which is what your hypothesis predicts I should believe.
The point is that your earlier comment only makes sense if you assume it's hard to tell that Autopilot can behave erratically if not supervised. It is not hard to tell, nor does it take long to work out; somewhere back in the comments I threw out ten to twenty minutes behind the wheel, and I'll stand by that.
It’s easy to convince yourself of all kinds of things, by the power of pure logic, if you haven’t tried doing them. Experience counts for something in coming to a well-informed opinion.
But not all drivers or consumers are. That's why naming the feature Autopilot is irresponsible. None of the comparable features from other manufacturers have names that imply the system drives itself. Assuming everyone has the same experience as you is… interesting. You may not die in a fiery Autopilot Tesla crash, but you sure seem eager to die on this hill.
Speaking of disingenuous, that wasn’t the position I was taking, now was it? If you look at the post to which I replied, the poster was asserting that cars must be made idiot-proof. I paraphrase, but not by much. There’s considerable daylight between “good luck with that” and “cars can’t be made safer”.
You didn't paraphrase; you set up a strawman. Maybe you didn't intend to, but that was the effect. Hyperbole eliminates the middle ground where, I agree, the useful solutions lie.