Recently, while driving through the deserts of Utah, between crossroads with nobody in sight, my wife and I came across a horrific accident. A camper van had plowed into a guardrail protecting a shallow wash. The van was completely demolished: pieces of wreckage were strewn all over the highway, and a well-used plush love seat sat upright on the center line.
Amazingly, two truckers had stopped and were attending to the passengers, directing traffic, and placing hazard flares on the road. The code of the highway was in effect: helping your fellow man in need.
Would driverless cars and trucks decide to stop and be of assistance upon arriving at a crash scene? Would they call in for help? Would they unlock to provide shelter? Would they provide medical supplies, food, and water?
Or would they just keep driving down the highway…
This. Stopping is only helpful if the vehicle or its occupant has the ability and willingness to assist. If the passenger says “Car, pull over, I need to help these people,” it had better damn well stop.
The last accident I witnessed happened directly in front of our car and missed us by a couple of inches (a small SUV was t-boned, then flipped and spun, by a Malibu that didn’t even slow down for the stale red light it should’ve stopped for). We didn’t have much choice but to stop where we were. Autonomous vehicles are supposed to recognize detritus on the road (there was glass and plastic all over the road immediately before us), right? I think one still would’ve stopped in this situation.
Yes, I assume that driverless cars with passengers should and would be subject to certain overrides by said passengers. But I don’t imagine that passengerless vehicles should or could be, as that would certainly translate into liability for the vehicle owners.
Would driverless trucking companies be required by law to stop and assist? Or would the manufacturer of the driverless truck be required to program the truck with a Guardian Angel Routine?
Also, I know I say this in every driverless-car thread, but let’s all remember: no matter what choices we let driverless cars make in these situations, the net result for society is fewer collisions, fewer deaths, fewer injuries, etc. Because humans, all humans, yes, even you and me, are not wired to be good drivers.
I find it really difficult to understand how either of those arguments justify not stopping to give assistance.
I can see arguments based on the truck not being able to do much to assist other than call in the accident.
I gather the US does not have much in the way of laws requiring people to assist in emergencies where they can, so I can see it’s not much of an issue there, but I’d be interested to know how these things would be implemented where it is a criminal offence not to stop and help.
If I’m in a self-driving car, does it stop automatically if it spots an accident? Do I have to tell it to stop? What if the car thinks there’s an accident and I tell it to carry on? Does the car tell the authorities? Should it?
Do you know of any places where it actually is a criminal offense not to help?
My cursory glance at “Duty to Rescue” seems to show that most English-speaking countries do not have laws holding people liable for failing to rescue except in “special relationships.” Caretakers, spouses, etc., fall under this category, along with other relationships like employers and employees and so on. The other case is if you caused the accident.
I honestly don’t think a car should stop automatically, or punish people for not stopping. What if the car stops and I further injure or kill an accident victim? What if there was nothing I could do, or I’m not mentally able to assist? Why would I be required to stop in a driverless car, when I wasn’t when I was driving?
The best possible solution, to me, is a reporting system in both the crashed car and any passing car that contacts the authorities. If a person would like to stop, they should. But I don’t like the idea of making everyone and anyone rescuers by law.
A human in the loop applies learned ethics and morals: stop if one wants to help, or keep on going if one doesn’t want to be bothered. No law should force a person to be an angel. Even a doctor’s Hippocratic Oath is just that, an oath. However, if I’m stranded on the side of the road next to a smoking heap of what used to be my vehicle, bleeding, hungry, and cold, I’d like an autonomous vehicle to stop for me.
Sure, and I imagine most people who saw you would stop. But do you think unmanned vehicles should? Should they stop and allow you into their cargo area? Should they be required to have a space available for stranded people?
Yes. At the very least, report the accident. There is an opportunity to act as a kind human would and try to help. Autonomous vehicles IMO should serve as a lifeboat for those in need. It could be as simple as providing access to a first aid kit or food and water. Access to space for transport and shelter makes sense, but I don’t see an autonomous truck hired by, say, a food distributor ever allowing that. The idea feels right, however.
Yeah, I don’t disagree, but I also don’t think it’s really practical. Reporting should be a given, but stopping? I’m not so sure.
Autonomous vehicles are not kind humans or lifeboats. They are more like cargo train cars, in my opinion.
If they are required to stop, how long should they stay stopped? Will they be able to tell if the person in the accident is conscious, or doesn’t want to leave, or has already left? Will companies be compensated for delays in their shipments or compromised cargo? And if so, by whom?