Sometimes my humor is dark. That was of course a morbid joke… I mainly found it humorous that you added cute to the equation.
Imagine the horror if it was wealthy private school kids from a school whose name ends in the word "Academy"!
If it were me as a human driving, I'd personally prioritize a bunch of kids over my own safety. I think I'm biologically wired that way; that, and I have children of my own, so nature has imbued me with a "protective parent mode" which kicks in automatically around any kids in danger.
That's what I meant by "online". I assume the car needs a network connection occasionally to work properly, and failing a checksum could get you banned from the road (and possibly arrested).
A checksum can be faked. Just run two controllers: one that runs the car, and another that thinks it is running the car (possibly in a virtual environment, some simple simulation, a dream paradise for a controller) and supplies the checksums. Or just hack the checksum-providing subsystem too. You may need a crypto key sunk somewhere in the silicon, but that may be accessible via side channels, e.g. differential power analysis.
…Or tear it right from the silicon itself. Another thing that needs to be in every, or at least every other, garage is an electron microscope with a focused ion beam… …and we're back to my earlier call for an affordable system for vacuum hacking…
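The shadow-controller idea above can be sketched in a few lines. This is a toy illustration, not any real vehicle API: all names are made up, and the "firmware" is just a byte string. The point is that if the checker only asks for a hash of the firmware image, a compromised controller can run modified code while replaying the hash of a pristine copy it keeps on the side.

```python
import hashlib

# Illustrative blobs standing in for firmware images (not real firmware).
ORIGINAL_FIRMWARE = b"factory firmware image"
MODIFIED_FIRMWARE = b"hacked firmware image"

def checksum(blob: bytes) -> str:
    """Plain SHA-256 over the image, hex-encoded."""
    return hashlib.sha256(blob).hexdigest()

class HonestController:
    """Attests to the firmware it actually runs."""
    def __init__(self, firmware: bytes):
        self.firmware = firmware

    def attest(self) -> str:
        return checksum(self.firmware)

class ShadowController:
    """Runs modified firmware but reports the checksum of the
    pristine image it keeps around just for attestation."""
    def __init__(self):
        self.firmware = MODIFIED_FIRMWARE       # what actually executes
        self.pristine_copy = ORIGINAL_FIRMWARE  # what gets hashed

    def attest(self) -> str:
        return checksum(self.pristine_copy)

expected = checksum(ORIGINAL_FIRMWARE)
print(HonestController(ORIGINAL_FIRMWARE).attest() == expected)  # True
print(ShadowController().attest() == expected)  # also True, yet hacked code runs
```

Both controllers pass the check, which is why real attestation schemes anchor the measurement in hardware (e.g. a key fused into the silicon) rather than trusting the software to hash itself, and why the thread moves on to extracting that key via side channels.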
I'm sure this is upthread somewhere, but the best answer to this question is not to ask.
Autonomous cars will work best when all road traffic is autonomous; until then, anyone who trusts their autonomous car completely is a nimrod and largely at fault.
Srsly, until all people are off the road (as drivers), the accidents that occur are on them. So the effective answer is: if an autonomous car is in a no-win situation and the occupant fails to override in any manner, then the car should self-sacrifice and let the doctors/lawyers sort it out after the fact.
As for the rich car vs. poor car, remember it's a no-win situation. If a rich guy fails to override and his car is programmed to kill others to save him, is that so different from what he'd do if he did override? Because if it is, he would override; and if it isn't, he could go either way: he runs over the poor people, or his AI runs over the poor people. It's all good, he can afford the settlements.
Same for the poor guy, except his only option is to override in order to take out others to save himself. His AI ain't gonna do it, because he couldn't spring for that option; he goes to jail/gets sued into oblivion if he does it, and is dead or maimed if he doesn't. A difference without a distinction, in adding an autonomous car to the mix?
But if all road traffic is autonomous without exception, sure, maybe fog/ice/moose will still cause some road carnage, but the question posed won't come up much in a refined interconnected system.
Why would the car be programmed to kill me when it could just dial 911?
Uh?
If you are paying for the car, it should be your proxy; that would mean it should look after your welfare in a reactive manner.
What if it's a Zipcar, and the pedestrian is also a Zipcar member?
At that moment, the pedestrian is not the service user and is therefore exempt.
This could be helpful with collections…
This topic was automatically closed after 5 days. New replies are no longer allowed.