I’m reminded of the Uber autonomous vehicle fatality. The system lulled the “backup” driver into complacency by mostly not needing intervention. But when it did need intervention (because it was designed not to brake for detected objects in the road, to avoid false positives), it also wouldn’t alert the driver to that fact, even though the system knew it was happening. Uber also apparently failed to tell the drivers it was set up this way, knowledge that might have made them more alert. The result was that a low-level employee (who may even have been a contractor, i.e. “not an employee”) was set up to take the blame for a system fundamentally designed to have an accident that the human in the car would find very difficult, if not impossible, to avoid.