I can’t believe they would even consider not stopping. Having said that, when I was growing up in PA in the 1970s, my 10th grade social studies/civilization teacher said that the law on the books in PA had never been updated from the early days of automobiles, and that it defined “stop” as slowing down to the point that the spokes in the wheel could be counted. Of course, he pointed out that the police always interpreted “stop” at a stop sign as ceasing motion, but he suggested that if anyone got a ticket for a rolling stop, one might contest it and pull up the definition on the books. Of course, then you’d have to hope that your hubcap had something akin to “spokes” and even then…
Bicycles are permitted to ride on the shoulder, whereas motor vehicles are not.
I was once stopped by a local police officer for having a flashing headlight on my bike. He was completely mistaken in telling me it was prohibited. In Ohio, only motor vehicles are specifically mentioned for that prohibition.
Didn’t claim to. My implied point was about bicyclists who complain about inconsiderate drivers or drivers who don’t pass them safely, while themselves not following the rules of the road, which can also lead to hazardous situations.
So basically Tesla’s stock value hovering at about $1000 a share has nothing to do with the financial strength of the company or the quality of its products, but is instead a function of manipulation and hype.
There is a third effect, but I can’t decide on its value: the law is never enforced as a primary offense but is used in determining fault when something goes wrong. The law describes the safest behavior that can reasonably be expected. If someone decides that’s too conservative, they’d better be right, because while they may have driven that way for years, it’d be a sign of recklessness in case of an incident.
I can’t imagine someone being smart enough to program self-driving cars, yet also deliberately programming them to run stop signs and thinking “This is a good idea. What could possibly go wrong? I’m a genius and life will only improve for myself and my employer.” You don’t hire a flat-earther to design rockets, how did THESE folks get hired to program cars?
Tesla to disable ‘self-driving’ feature that allowed vehicles to roll past stop signs at junctions
According to recall documents on America’s National Highway Traffic Safety Administration’s website [PDF], Tesla has agreed to disable the feature via an over-the-air (OTA) software update. It is expected to begin deploying the update to affected vehicles in the next week or so. […] According to the documents, this covers the 2016-2022 Model S and Model X, 2017-2022 Model 3, and 2020-2022 Model Y vehicles.
The Associated Press reports that selected Tesla drivers are effectively being allowed to beta test the Full Self-Driving software on public roads, as long as the owner enables the function. Once enabled, the Tesla would be able to go through an all-way stop junction without coming to a complete halt. It would have to be traveling below 5.6 mph (9 km/h) while approaching the junction, and no moving cars, pedestrians or bicyclists could be detected nearby. […] Tesla has previously insisted that its vehicles are not capable of fully autonomous driving: “The currently enabled Autopilot and Full Self-Driving features require active driver supervision and do not make the vehicle autonomous.”
The thing is that, as of now, the “Autopilot” isn’t really one - and not every driver realizes this (or doesn’t care) and uses the feature accordingly. They also don’t realize that while Tesla is most likely off the hook in the event of a fatal crash, they themselves are not: Last month, a Tesla driver was charged with vehicular manslaughter after a crash in Los Angeles in which the vehicle’s Autopilot mode was engaged.
It was a totally idiotic decision. For starters, the goal should not be to be as good as humans — it should be to be much, MUCH better than humans. And to accomplish that, you need to employ whatever sensors you (practically) can to allow it to cope with low-visibility situations, fast-approaching objects at odd angles, etc.
It’s interesting that Musk decreed camera-only is sufficient, while Argo does this. I know which one I’d rather share the road with.
Therein lies the fundamental problem Tesla has from a business standpoint. Whatever novelty and innovation they may have will be overtaken by the larger automakers by dint of their massive advantages in economy of scale. Other automakers can improve upon, outpace and out-produce Tesla. More importantly, they have the deeper sales and customer service infrastructure. Tesla is notorious for shitty service.
Yeah, exactly. It’s idiotic on multiple levels. What always was exciting to me about autonomous driving projects was that they could pull in data from sensors that had no human equivalent, to give the system information about what was going on that could make it a better driver (at least in specific circumstances) than a human being. Information it would absolutely need if it was to have any chance of making up for the lack of a human brain being behind the wheel.
Yes! Human drivers are incredibly poor - it’s not exactly a lofty goal. To then hamstring everything so even reaching that bare minimum becomes impossible - and then, even worse, to not even realize that’s what you’re doing… it’s baffling to me that anyone thinks Musk is smart. He just seems like a perpetual example of the Dunning-Kruger effect to me.
I wonder if Musk made the decision not so much because he thought it was sufficient, but because of cost or looks. Apparently there was a sensor that he redesigned because he didn’t like the look - but the redesign caused it to not work at all in icy weather. (Apparently they went straight from Musk’s redesign to production without any testing, too?) Musk’s whole approach is always that of a mediocre programmer who is used to working on unimportant systems that have no real-world impact, and he can’t escape it. It doesn’t seem like it even occurs to him to try, and the attitude is widespread in the company, too. Make some changes for the heck of it, throw them out there to see if they work, if someone dies, change them again…
It’s so Tesla (and Musk). Poorly thought out (with obvious UI issues), forces drivers to contradict their learned instincts (and becomes dangerous as a result), done mainly because it looks cool…
Get a bunch of “tech industry” people like Musk who are used to working on non-critical systems, who want to “disrupt” existing systems and put them in charge of automobiles where “move fast and break things” becomes literally true…
On a visit to Brazil years ago, my wife and I were freaked out when our friend who lived there was driving us around at night and repeatedly would just slow down at intersections marked with stop signs and, if clear, would blow right through them. Turns out it was legal after a certain hour because of the number of carjackings where people had been jumped once they stopped the car. When in Rome.
The company has far deeper pockets than the driver. So when one of these maims or kills, it is going to make far more sense to sue the company than the driver. Programming it to disobey traffic laws, no matter how minor, makes the plaintiff much more likely to win in court. You would think that one of the investors would have told the techbros that.
And stop me if I’m wrong here, but I thought the Teslas all had connectivity and were constantly being updated from the mothership. Why do they have to be physically “recalled”? Isn’t this just a software update?