Messy: When automated anti-disaster systems make things worse, and what to do about it

Originally published at: http://boingboing.net/2016/10/11/messy-when-automated-anti-dis.html

2 Likes

For some things, like flying, I agree: people should be doing all the work.
But for stuff like driving? Ha, hell no. There’s a highway that runs through Calgary called the Deerfoot, where at least one person has a serious accident every fucking day, mostly because of tailgating and inattention.

We all see those people, every day, driving poorly and surviving because other people are attentive enough to keep them alive. Those people should have to pass a regular exam of some sort; something more intensive than the usual one-time adult driver’s exam.

1 Like

Computers don’t make mistakes all that often. For the most part, they do exactly as they are told. And the components that do fail, fail often enough that error-detection and correction mechanisms exist to catch them.

Humans who program computers, however, do regularly make mistakes. So do people who analyze the problems the computers are programmed to address.

Never lose sight of the fact that, for the most part, computer error is human error: humans and organizations of humans are almost always ultimately responsible for the misconduct of the computer. To impersonally blame “the computer” is more or less a cop-out and the sign of someone who isn’t able, willing, or ready to confront reality, and that attitude probably contributes to the problem.

However, also note that for the most part, the mistakes are inadvertent. Maybe no one put all of the pieces together to figure out the problem. Maybe someone did but didn’t realize the importance or context of what they’d found. But it’s still important to find out who is responsible and inform them, so they can correct the problem as much as possible.

At least, that’s my opinion, as someone who fixes broken systems.

1 Like

One problem with self-driving cars is that something like 85% of all drivers consider themselves better than average. That’s 85% who will feel less safe in a driverless car.

3 Likes

In the case of Flight 447, the designers of the flight control software had never anticipated a failure of all the redundant pitot-static systems (the airspeed-monitoring equipment); they simply had not exhausted all the seemingly absurd possibilities. And so their system defaulted to a simple catch-all drawer for things that couldn’t possibly happen: just stop ALL automation, sound an alarm, and let the humans figure it out.

When the alarm sounded, the rightly startled humans found themselves in the middle of a severe thunderstorm, with absolutely no indication of what was actually wrong.

The SOP for failed airspeed indicators is to trim the plane to a known pitch and power setting, which keeps the aircraft in stable, level flight, and then figure out how to proceed from there. If the designers of the software had implemented a catch-all scheme that simply trimmed the aircraft and then alerted the crew that “we’ve got an instrumentation failure, but I’ve trimmed the aircraft within a stable flight envelope,” then disaster would’ve been averted.
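
In rough pseudocode, that catch-all might look something like this minimal sketch (the aircraft interface and the pitch/power numbers are invented for illustration, not taken from any real flight control system):

```python
# A minimal sketch of the "trim first, then alert" catch-all described
# above. The `aircraft` interface and the numbers are invented for
# illustration; real flight control software is vastly more involved.

SAFE_PITCH_DEG = 5.0     # hypothetical known-safe pitch attitude
SAFE_THRUST_PCT = 85.0   # hypothetical known-safe power setting

def handle_unrecognized_failure(aircraft):
    """Catch-all handler: stabilize the aircraft before handing over."""
    # Rather than dropping all automation at once, first trim to a
    # known pitch and power setting that keeps the plane level.
    aircraft.set_pitch(SAFE_PITCH_DEG)
    aircraft.set_thrust(SAFE_THRUST_PCT)
    # Only then alert the crew, saying exactly what was done.
    aircraft.alert_crew(
        "Instrumentation failure: aircraft trimmed within a stable "
        "flight envelope. Take over when ready."
    )
```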

5 Likes

Driverless cars? I’m waiting for passengerless cars.*
 

  • /looks in rearview mirror…Take that, Suzie! Stop hitting your brother! Bobby, I told you no food in the car! Different Suzie, you should’ve gone before we left! That’s it! Honey, hold the wheel. I’m going back there. If I don’t make it back, tell yourself I love you.
6 Likes

I really like the 99% Invisible episode about this exact story and problem. They also have a follow-up episode about automated driving.


1 Like

As I read it, Harford’s thesis is that the biggest mistake users make is to trust the computer, where “the computer” includes all those who gave it the instructions. “The computers do as they are told” is no help when you’re not enough of an expert to guess what the programmers might have forgotten to tell your computer/airliner/car/arrest algorithm.

4 Likes

Humans certainly aren’t immune to this (see every ‘bubble’ ever, lethal stampedes where a nice orderly queue could have gotten everyone out in plenty of time, etc.); but one thing computers’ “do exactly what they were told, very fast” style handles badly is orders that produce unpleasant and unexpected emergent behavior.

People can, and all too frequently do, keep merrily marching toward disaster in these cases; but they do sometimes recognize that This Should Not Be Happening and either freeze up or start improvising, which can head off whatever was going to cascade out of control.

Under the same circumstances, computers just cascade out of control. Sometimes this is merely humorous (reseller bots on Amazon getting into little bidding wars with one another), sometimes it goes genuinely badly; but myopically continuing to do the wrong thing is pretty much assured.

That said, the great thing about computer systems (of at least reasonable quality) is their very good resistance to ‘dumb mistakes’. Humans get tired, get distracted, try to multitask, and suddenly they’ve slipped a decimal place, omitted a step, or transposed a couple of characters. Computers are much better about avoiding that. They’ll never do the right thing on their own initiative; but they don’t produce a modest but steady stream of idiosyncratic mistakes.

2 Likes

Sounds like a good fix for this specific problem. Engineering for system failures is very important, especially for systems where failure means life or death. Then again, hindsight is 20/20, so we need to admit that problems that weren’t designed for will happen, and often the root cause is not purely technical.

3 Likes

No doubt the Airbus engineers – in addition to taking measures to prevent pitot-static systems failures – implemented such changes. Also, Boeing and Airbus, prompted by Air France 447, implemented failsafe airspeed inference based on GPS, pitch, power setting, and climb rate inputs.
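
Speculatively, such an inference could blend independent estimates along these lines (the lookup table, wind handling, and blending rule below are all invented for illustration; this makes no claim about the actual Boeing or Airbus implementations):

```python
# Hypothetical sketch of inferring airspeed when pitot data is lost.
# Table values and the blending rule are invented for illustration.

# Invented lookup: (thrust %, pitch deg) -> expected true airspeed (knots)
PITCH_POWER_TABLE = {
    (85.0, 2.5): 480.0,   # typical cruise
    (60.0, 0.0): 350.0,   # descent
    (95.0, 10.0): 250.0,  # climb
}

def infer_airspeed(thrust_pct, pitch_deg, gps_groundspeed_kts, wind_kts=0.0):
    """Blend a pitch/power estimate with a GPS-derived estimate."""
    # Nearest-neighbour lookup in the (thrust, pitch) table.
    key = min(PITCH_POWER_TABLE,
              key=lambda k: abs(k[0] - thrust_pct) + abs(k[1] - pitch_deg))
    table_estimate = PITCH_POWER_TABLE[key]
    # GPS groundspeed corrected by a last-known wind estimate.
    gps_estimate = gps_groundspeed_kts - wind_kts
    # Simple average of the two independent estimates.
    return (table_estimate + gps_estimate) / 2.0

print(infer_airspeed(85.0, 2.5, gps_groundspeed_kts=500.0, wind_kts=30.0))
```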

2 Likes

What about long stretches of mind-numbingly boring trans-oceanic flight? An autopilot can make the necessary, constant, minute adjustments to keep the plane on course without fatigue. A human pilot doing the same task will go slightly off-course and correct frequently, burning more fuel, all while building up mental fatigue. Then, as the plane approaches the destination and things get more interesting (altitude changes, more air traffic, geographical hazards), you’ve got a mentally-fatigued human in charge instead of a refreshed one?

To me, the land-bound equivalent is the real promise of the “driverless” car. It’s not very helpful in crowded, urban, rush-hour traffic. It’s when someone is driving the 552 miles of uninterrupted rural interstate between El Paso and San Antonio that an automotive autopilot would be beneficial.

4 Likes

There is a benefit of computer-created mistakes over human-created mistakes. If you find the cause of the error, you can upgrade all the computers. You can’t upgrade all the humans.

3 Likes

This subject fascinates me and I look forward to reading Harford’s book. But the AF447 story has another important aspect beyond the automated avionics: physical design.

Unlike the old manual control yokes (i.e., the aircraft’s ‘steering wheels’), which are physically connected between the two pilot positions, the Airbus has joysticks, one alongside each pilot. They are completely independent, and neither pilot can feel what maneuver the other is attempting, as one would on a linked yoke. In this incident, that turned out to play a significant role in the deadly cascade of errors.

When one pilot is pulling back (commanding ‘climb’) and the other is pushing forward (commanding ‘descend’), as happened on AF447, the flight computer, getting inconsistent inputs, cancels them out without either pilot being aware of the other’s action. This design feature, which compounded pilot error, figured prominently in the catastrophe.
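
A toy model makes the cancellation concrete (the sum-and-clamp rule here is a simplified assumption for illustration, not the actual Airbus control law):

```python
# Toy model of two independent sidesticks whose pitch inputs are
# combined algebraically. The sum-and-clamp rule is a simplification,
# not the actual Airbus control law.

def combined_pitch_command(left_stick: float, right_stick: float) -> float:
    """Combine two stick inputs, each in [-1.0, +1.0].

    +1.0 = full nose-up (pull back); -1.0 = full nose-down (push forward).
    """
    total = left_stick + right_stick
    # Clamp to the deflection range of a single stick.
    return max(-1.0, min(1.0, total))

# One pilot pulls full back while the other pushes full forward:
print(combined_pitch_command(+1.0, -1.0))
# -> 0.0: the commands cancel, and neither pilot feels the other's input.
```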

To read a very thorough account of AF447, see this Popular Mechanics article.

Great choice of the photo from ‘Airplane’ by the way.

3 Likes

I think the article makes the mistake of focussing on the tree and forgetting the forest.

The fate of AF447 was terrible, and it is inarguable that poor actions by the pilot significantly contributed to that incident. Automation probably played a role in that pilot’s insufficient skills.

However, before talking about how all planes should be flown manually, it is worth remembering that there are almost 1,300 A330s in service, flying every day: a stupendous number of flying hours. From my brief research, there have been two fatal incidents due to pilot error.

Removing the autopilot systems would almost certainly result in more incidents. So while it may have helped the people on board AF447, it would only do so by condemning far more to death.

This has to be kept in mind when drawing parallels with car automation. Automated cars will kill people. However, the starting point is 1.25 million vehicle deaths per year worldwide, not zero. So killing a few is arguably OK, so long as automated cars kill fewer people than manually driven ones do.
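
The back-of-the-envelope arithmetic (with a purely hypothetical automation figure) looks like this:

```python
# Back-of-the-envelope comparison. The human baseline is the ~1.25M
# annual road deaths cited above; the automated-fleet factor is purely
# hypothetical, for illustration only.

human_deaths_per_year = 1_250_000      # worldwide, roughly (cited above)
hypothetical_automated_factor = 0.4    # assume automation cuts deaths 60%

automated_deaths = human_deaths_per_year * hypothetical_automated_factor
lives_saved = human_deaths_per_year - automated_deaths
print(f"{lives_saved:,.0f} hypothetical lives saved per year")  # 750,000
```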

1 Like

Bring on the robot cars! Just today I saw one person stop at a green arrow, one person stop in a very small roundabout for no reason, two people go against the right-of-way, and one person cut across traffic to rush into the wrong lane. Not to mention myself - there are downsides to having a human in a foul mood operating machinery that weighs over a ton… people are not suited for driving.

2 Likes

Human pilots actually do get upgrades – Crew Resource Management (CRM) is one example which was developed as a result of some Very Bad accidents, and is now required by the FAA. It was a very important development in aviation safety.

Although, according to the CRM Wikipedia article, the Air France 447 pilots had seemingly abandoned CRM:

The men are utterly failing to engage in an important process known as crew resource management, or CRM. They are failing, essentially, to cooperate. It is not clear to either one of them who is responsible for what, and who is doing what.

This stuff is fascinating – lots of questions about how we interact with our technology and each other.

1 Like

This topic was automatically closed after 5 days. New replies are no longer allowed.