… and how human error begets automation, which in turn increases human error.
One of the most harrowing things you’ll ever read.
I hadn’t read a full account of this plane crash before. Thanks.
“Whenever you solve a problem, you usually create one. You can only hope that the one you created is less critical than the one you eliminated.”
The same situation occurs in medicine. There are infusion pumps that automatically feed a given amount of drug, say for pain management, into the patient at a set rate. Nurses are trained to read the numbers on the pump, and that constitutes their knowledge of it. While the procedure is no longer new, it is still relatively new to particular hospitals.
When nurses first encounter this, they shy away on the grounds of 1) not understanding how to operate the pump, 2) not trusting it to work correctly, and 3) losing direct contact with the patient’s true pain level. (All of these points are valid.)
But the longer the nurse works with this auto-feed, the more s/he comes to rely on it, and if not attentive, will often forget to assess the patient: behavior, signs and symptoms of distress, even the patient’s own concerns. (Often they either don’t trust the pump, or start to rely on it too much.)
This sequence will surely occur with cars on auto-pilot as well, especially with inexperienced drivers, drivers who don’t pay attention, or drivers who over-estimate their driving ability. Automation itself – including the use of robots – is only a tool to ease the load; it’s not a substitute for learning, paying attention, or getting the job done right.
Amen, Pat. A great problem in so much of our lives. When I was a young nurse, we took care of patients, but when I retired 30 years later, most nurses were spending the greater part of their time with machines and paperwork. Patients get a lot of value when medical professionals use those tools wisely, but reliance on the tools without human interaction eliminates much of the CARE. Nothing can replace the independent observation, assessment and judgment of a truly competent nurse who does CARE – interacting with the human being. And that’s the part we’re losing.
It’s a good article and an entertaining story, which is what it was meant to be. I enjoyed it, but I didn’t come away thinking I understood the crash any better than I did before.
The author strips away a lot of conflicting and contradictory details. That makes the story easier to understand and more readable. That’s good for an infotainment article, but bad for understanding the causes of this tragedy.
I’ve studied this crash intensely. It is simply not credible to assert that three experienced pilots did not recognize that they were in a deep stall. One might conceivably be fooled, but not all three.
What the CVR and data records don’t show is what the pilots were being shown on the displays. A lot of the contradictory material stripped from the story suggests the displays were at a minimum deeply confusing, and very possibly showing grossly inaccurate information. We just don’t know. The culture of both Air France and Airbus doesn’t help here, nor does the liability at stake should bad displays turn out to be an important factor.
The points about automation and complacency are valid. I agree with most of the conclusions of the story. But I keep in mind that it is just a story. It fits neatly with those facts the author selected. But it doesn’t fit nearly so well with the totality of the contradictory, confusing evidence.
-s — I know you have the experience to see more deeply into this. But if all three pilots did understand they were in a stall, why wouldn’t they take the obvious, well-understood actions to pull the plane out of it before it was too late?
One thing that does seem clear is that both the first officer and the pilot were trying to control the aircraft at the same time, and that one of them was trying to put the plane into a dive. That is unintuitive to someone who is not thinking clearly, since the last thing logic tells you to do is point the plane at the ground…but in this case, diving the plane is EXACTLY what needed to be done. One of them panicked and was commanding the plane to climb, the intuitive thing to do, but the WRONG thing. AND on the occasions when one of them managed to get the plane into a lower pitch attitude, the STALL warning came back on…it had gone silent once the pitch was higher than the programming had been set up for. That would be horribly confusing to somebody who thought the plane was in a stall and had managed to lower the attitude. The panicked pilot was feeding huge corrections into the plane and commanding a high pitch up. The other pilot could not ‘feel’ those huge overcorrections through the unconnected flight controls, and so had no way of knowing the man was utterly panicked, or of doing something about it.
I give the pilot who came to the scene in the middle of the crisis a bit of a pass, since he had seconds to try to understand what was going on, and the info he was getting was contradictory.
Another thing is that this was a ‘fly by wire’ plane. The flight controls are not directly connected to the control surfaces (these days, though, new airliners are ALL built this way). The flight control is an input to a computer that then decides how much to move the control surfaces. In a directly connected plane, there is a ‘feel’ to the controls. A decent pilot, even a fairly inexperienced one, can easily tell the difference between a plane that is flying normally and one in a deep stall.
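For what it’s worth, here is a minimal sketch of the difference I’m describing. It is purely illustrative and written in Python rather than anything that flies; the function names, gains, and limits are all made up and are not Airbus’s actual control law.

```python
# Purely illustrative sketch; not real flight-control code.
# All names, gains, and numbers below are hypothetical.

def direct_linkage(yoke_deflection_deg, aerodynamic_load_n):
    """Cables/pushrods: the surface follows the yoke, and the aerodynamic
    load on the surface is felt back through the yoke (buffet, heaviness)."""
    surface_angle_deg = yoke_deflection_deg      # mechanically coupled
    force_at_yoke_n = aerodynamic_load_n         # the pilot feels the air
    return surface_angle_deg, force_at_yoke_n

def fly_by_wire(stick_deflection, measured_pitch_rate_deg_s):
    """Side-stick: the stick is just a number fed into a control law.
    The computer decides the surface position; nothing is driven back into
    the stick, so it feels the same in normal flight and in a deep stall."""
    commanded_pitch_rate = stick_deflection * 5.0            # hypothetical gain
    error = commanded_pitch_rate - measured_pitch_rate_deg_s
    surface_angle_deg = max(-30.0, min(30.0, 2.0 * error))   # hypothetical limit
    force_at_stick_n = 0.0                                    # spring feel only
    return surface_angle_deg, force_at_stick_n

print(direct_linkage(10.0, 250.0))   # (10.0, 250.0); the pilot feels the load
print(fly_by_wire(0.5, 0.0))         # (5.0, 0.0); no load feedback at all
```

The point of the sketch is simply that in the second case the stick carries information in only one direction, which is why one pilot’s wild inputs could be invisible to the other.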
There was NOTHING wrong with the plane. The pitot tubes were plugged, but the GPS was still working. Airspeed info was available, just not presented well. Pitot tubes DO plug up.
So terribly sad. So unnecessary.
Airbus has a decades long history of at the minimum confusing its pilots into fatal situations. Once is happenstance; twice is coincidence; three times is a toxic marriage of arrogant engineers and arrogant pilots.
I won’t fly Air France again, and don’t like flying Airbus. Then again, my Jeep doesn’t grope me and so I don’t fly much at all any more.
And if you think this is bad, just wait until the Transportation Department really pushes autonomous cars. Arrogant engineers combined with arrogant bureaucrats …
Clearly the system engineers made a mistake if putting the nose down caused the stall alarm to come ON, and putting the nose up caused it to go OFF. They must not have tested at those high angles of attack, or foolishly (it turns out) assumed a pilot would never get there.
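To make the point concrete, here is a toy sketch of the kind of validity gating that can invert a warning this way. The names and thresholds are invented for illustration; by most accounts the real logic suppressed the warning when the indicated airspeed was too low for the angle-of-attack data to be considered valid, which is roughly what this mimics.

```python
# Toy illustration of validity-gated alerting; thresholds and names are made up,
# not the real A330 logic.
STALL_AOA_DEG = 10.0        # hypothetical stall angle of attack
MIN_VALID_SPEED_KT = 60.0   # hypothetical threshold below which AoA is "invalid"

def stall_warning(angle_of_attack_deg, indicated_airspeed_kt):
    """Return True if the stall warning should sound."""
    data_valid = indicated_airspeed_kt >= MIN_VALID_SPEED_KT
    return data_valid and angle_of_attack_deg > STALL_AOA_DEG

# Deep stall, airspeed reading very low: the warning is silent.
print(stall_warning(angle_of_attack_deg=40.0, indicated_airspeed_kt=45.0))  # False
# Nose lowered (the correct recovery), speed builds: the warning comes BACK on.
print(stall_warning(angle_of_attack_deg=15.0, indicated_airspeed_kt=90.0))  # True
```

Each line of that logic is reasonable on its own; it is the combination, in a corner of the envelope nobody expected, that punishes the pilot for doing the right thing.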
Almost every computer program and piece of computer hardware has bugs that turn up only in unusual conditions. Such conditions are extremely hard to test (which is where the bugs would normally show up and get fixed) because there are so many variables. You’d have to spend many years checking corner cases. (I spent much of my working life as a test and systems engineer.) Spending years testing this stuff is not going to happen, because it slows time to market and lets your competitors beat you.
I have stopped flying because of TSA. Now, after reading that story, I have another reason not to fly.