On June 1, 2009, Air France Flight 447 crashed into the ocean on its way from Rio de Janeiro to Paris. 216 passengers and 12 crew died on impact. This month, the official investigation is likely to conclude with “human error” as the culprit: pilots making mistakes that caused the plane to crash. But evidence unearthed by The Telegraph tells a different story: that the pilots of the Airbus A330-200, and everyone else on the plane, were really victims of bad design.
And it wasn’t just one component that could have averted the tragedy. Multiple Airbus-designed systems combined to create a deadly feedback loop that convinced the pilots to climb until the plane stalled and fell from the sky. By the time they figured out what was going on, they’d lost too much altitude to point the nose down and regain the speed necessary to maintain lift. Even if you have only a basic grasp of how planes fly, I think the cockpit’s last words from the black box transcript explain the problem with eerie conciseness:
02:13:40 (Robert) “Climb… climb… climb… climb…”
02:13:40 (Bonin) “But I’ve had the stick back the whole time!”
02:13:42 (Dubois) “No, no, no… Don’t climb… no, no.”
02:13:43 (Robert) “Descend… Give me the controls… Give me the controls!”
At heart, the problem was one of feedback. In a world of flight dominated by computers, Airbus designs its planes with less tactile response (in the name of pilot comfort) and less potentially overwhelming information (in the name of clearer pilot decisions). In the case of Flight 447, the plane’s pitot tubes froze over, depriving the computers of airspeed readings and forcing the plane out of autopilot. In response, a pilot named Bonin pulled back on his stick, gaining a bit of altitude to, presumably, keep the plane safely in the sky.
This stick is of particular note. It’s fly-by-wire technology, meaning there’s very limited tactile feedback. If a pilot wants a 10-degree pitch, he can move the stick once and take his hand off the controls entirely. Furthermore, the two pilots’ sticks aren’t linked, so neither feels what the other is doing with his controls; as Bonin was making this momentary adjustment, the only way his colleagues could know was by looking right at his hands.
The plane’s computers eventually gave feedback that the plane was climbing too high at too great an angle, with an audio alert blaring “stall” 75 times. But without stick feedback, it seems Bonin’s co-pilot didn’t realize that the plane was set to gain altitude. Eventually heeding the computer’s warnings, Bonin’s co-pilot realized that they needed to put the nose down; the plane regained speed, and all seemed well. That is, until the restored airspeed triggered the plane’s computers in an unexpected way: though the plane was no longer stalling, the “stall” alerts began again, in essence prompting Bonin to pull back on the stick and force a real stall.
The pilots were right to mistrust the plane’s “stall” audio feedback mechanism. They just chose the wrong time to do so.
And if the cockpit had been designed to share another vital piece of information, the pilots could have made sense of these audio alerts on their own. The plane’s angle of attack (the angle at which oncoming air meets the wing) is measured by the plane’s computers but never shared with the pilots. For fear of information overload, the pilots were denied the one piece of information that could have prevented the accident.
As The Telegraph’s story concludes, flight simulators are already drilling this exact scenario into the minds of new pilots, meaning it’s highly unlikely that the same error will happen again. But why are we training around a design problem rather than taking steps to fix it? The aviation industry is a notoriously self-contained one. Owing to the immense complexity of the challenge, Airbus designs and constructs every aspect of its planes from start to finish. Organizations like that are reluctant to bring in third-party design perspectives, from companies like Frog and Ideo, to test for simple human-factors problems.
In this case, fly-by-wire steering obviously has shortcomings in its human feedback. A single-word “stall” warning should probably come with the secondary level of information the computer is using to draw that conclusion. And, my god, how are pilots expected to keep their planes in the air when they aren’t even told the angle of the air striking the plane?
The tragic irony is that, if Bonin had simply taken his hands off the flight stick, the plane would have stayed in the air. So the most immediate, obvious conclusion is to blame Bonin for the crash. His actions did crash the plane. His colleagues overlooked his mistakes. But that conclusion misses a pretty obvious point: Bonin and his colleagues were all trained, experienced pilots who would have made different decisions with better feedback. At some level, we can blame humans for every error in the world, but the point of good design is to understand human nature, accommodate it, and, most importantly, enable it.