
Dudes, we know this is an A380. Shutterstock has its limitations.

Co.Design

How Lousy Cockpit Design Crashed An Airbus, Killing 228 People

New evidence shows that a lack of pilot feedback from the cockpit controls led to the crash of Air France Flight 447. What led to such a design disaster?

On June 1, 2009, Air France Flight 447 crashed into the Atlantic Ocean on its way from Rio de Janeiro to Paris. All 216 passengers and 12 crew died on impact. This month, the official investigation is likely to conclude with “human error” as the culprit--pilots making mistakes that caused the plane to crash. But evidence unearthed by The Telegraph tells a different story: that the pilots of the Airbus A330-200, and everyone else on the plane, were really victims of bad design.

And it’s not just one component that could have averted the tragedy. Multiple Airbus-designed systems combined to create a deadly feedback loop that convinced the pilots to keep climbing until the plane stalled and fell from the sky. By the time they figured out what was going on, they’d lost too much altitude to point the nose down and regain the speed necessary to maintain lift. Even if you have only a basic grasp of how planes fly, I think the cockpit’s last words from the black box transcript explain the problem with eerie conciseness:

02:13:40 (Robert) “Climb… climb… climb… climb…”
02:13:40 (Bonin) “But I’ve had the stick back the whole time!”
02:13:42 (Dubois) “No, no, no… Don’t climb… no, no.”
02:13:43 (Robert) “Descend… Give me the controls… Give me the controls!”

At heart, the problem was one of feedback. In a world of flight dominated by computers, Airbus designs its planes with less tactile response (in the name of pilot comfort) and less potentially overwhelming information (in the name of clearer pilot decisions). In the case of Flight 447, some of the plane’s ducts froze up, cutting off the airspeed readings and forcing the plane out of autopilot. In response, a pilot named Bonin pulled back on his stick, gaining a bit of altitude to, presumably, keep the plane safely in the sky.

This stick is of particular note. It’s fly-by-wire technology, meaning there’s very little tactile feedback. A pilot can pull the stick back briefly and then let go; the stick springs back to center while the plane holds the commanded change, so there is no sustained force to feel. Furthermore, the two sidesticks aren’t linked, so co-pilots don’t feel any of each other’s inputs in their own controls, meaning that as Bonin was making this adjustment, the only way his colleagues could know was by looking right at his hands.
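To make that concrete, here is a minimal, generic sketch of a “command”-style fly-by-wire input -- purely illustrative, not Airbus’s actual control law (the A330’s normal law commands load factor, not pitch directly). The point it shows: a brief pull commands a lasting change, the stick re-centers, and nothing about the stick afterward tells either pilot what was commanded.

    # Generic "command"-style stick sketch (illustrative only, not Airbus software).
    # A brief deflection moves a held target; releasing the stick simply holds it.

    def update_pitch_target(pitch_target_deg, stick_deflection, dt_s, rate_deg_per_s=3.0):
        """Move the held pitch target at a rate proportional to stick deflection.

        stick_deflection runs from -1.0 (full forward) to +1.0 (full back).
        With the stick released (0.0), the commanded target stays where it is.
        """
        return pitch_target_deg + stick_deflection * rate_deg_per_s * dt_s

    pitch = 2.0                        # cruise pitch, degrees nose-up
    for _ in range(50):                # pilot holds the stick back for 5 seconds...
        pitch = update_pitch_target(pitch, stick_deflection=1.0, dt_s=0.1)
    for _ in range(100):               # ...then lets go; the commanded pitch stays put
        pitch = update_pitch_target(pitch, stick_deflection=0.0, dt_s=0.1)
    print(round(pitch, 1))             # ~17.0 degrees nose-up, held with zero stick force

Contrast that with a mechanically linked yoke, where holding a climb means holding a sustained pull that the other pilot can both see and feel.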

The plane’s computers eventually gave feedback that the plane was climbing too high at too great an angle, with an audio alert blaring “stall” 75 times. But without stick feedback, it seems Bonin’s co-pilot didn’t realize that the plane was being commanded to gain altitude. Eventually heeding the computer’s warnings, Bonin’s co-pilot realized that they needed to put the nose down; the plane regained speed and all seemed well. That was, until a proper amount of airspeed triggered the plane’s computers in a way the pilots couldn’t have anticipated. Though the plane was no longer stalling, it began the “stall” alerts again, in essence, causing Bonin to pull back on the stick and force a stall.

The pilots were right to mistrust the plane’s “stall” audio feedback mechanism. They just chose the wrong time to do so.
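The mechanism behind this confusing feedback, as detailed in the BEA’s investigation, is that the stall warning is driven by angle of attack, but below roughly 60 knots of indicated airspeed the angle-of-attack data is rejected as invalid and the warning goes silent. Push the nose down, let the airspeed climb back past the threshold, and the warning returns. Here is a minimal sketch of that kind of inhibition rule; the thresholds and names are illustrative, not Airbus’s actual logic:

    # Simplified stall-warning inhibition rule, loosely modeled on the behavior
    # described in the BEA report. Thresholds and names are illustrative only.

    STALL_AOA_DEG = 10.0        # illustrative stall-warning angle-of-attack threshold
    MIN_VALID_SPEED_KT = 60.0   # below this, angle-of-attack data is treated as invalid

    def stall_warning(aoa_deg, indicated_airspeed_kt):
        """Return True if the stall warning should sound."""
        if indicated_airspeed_kt < MIN_VALID_SPEED_KT:
            # AoA readings are rejected at very low indicated airspeed,
            # so the warning stays silent -- even with the wing stalled.
            return False
        return aoa_deg > STALL_AOA_DEG

    print(stall_warning(aoa_deg=40.0, indicated_airspeed_kt=55.0))  # False: deeply stalled, but silent
    print(stall_warning(aoa_deg=35.0, indicated_airspeed_kt=90.0))  # True: nose-down input "rewarded" with a warning

From the pilots’ seats, the correct input appeared to trigger the alarm and the wrong input appeared to silence it.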

And if the cockpit were designed to share one other vital piece of information, the pilots could have interpreted these audio alerts on their own. The plane’s angle of attack (the angle at which the oncoming air meets the wing) is measured by the plane’s computers but not shared with the pilots. For fear of information overload, the pilots were denied the one piece of information that could have prevented the accident.
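For reference, in steady flight the angle of attack is roughly the pitch attitude minus the flight-path angle, which is why a nose that looks comfortably “up” can hide a wing that is hopelessly stalled once the airplane starts dropping. A back-of-the-envelope illustration, using made-up numbers rather than AF447’s recorded values:

    import math

    # Rough relation for steady flight: angle of attack ~= pitch attitude - flight-path angle.
    # Hypothetical numbers for illustration only (not AF447 flight data).

    pitch_deg = 15.0              # nose 15 degrees above the horizon
    true_airspeed_kt = 150.0      # forward speed through the air
    descent_rate_fpm = 10000.0    # sink rate, feet per minute

    forward_fps = true_airspeed_kt * 1.68781          # knots -> feet per second
    sink_fps = descent_rate_fpm / 60.0
    flight_path_deg = -math.degrees(math.atan2(sink_fps, forward_fps))

    angle_of_attack_deg = pitch_deg - flight_path_deg
    print(f"flight path {flight_path_deg:.1f} deg, angle of attack {angle_of_attack_deg:.1f} deg")
    # Roughly 48 degrees -- far beyond the stall angle of an airliner wing --
    # even though the nose is pointed well above the horizon.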

As The Telegraph’s story concludes, flight simulators are already drilling this exact scenario into the minds of new pilots, meaning it’s highly unlikely that the same error will happen again. But why are we training around a design problem rather than taking steps to fix it? The aircraft industry is a notoriously self-contained one. Owing to the immense complexity of the challenge, Airbus designs and constructs every aspect of its planes from start to finish. Organizations like that are reluctant to bring in third-party design perspectives--companies like Frog and Ideo--to test for simple human-factors problems.

In this case, fly-by-wire steering obviously has shortcomings in the feedback it gives humans. A single-word warning like “stall” should probably come with the secondary layer of information the computer is using to draw that conclusion. And, my god, how are pilots expected to keep their planes in the air when they aren’t even told the angle at which the air is striking the plane?

The tragic irony is that, if Bonin had taken his hands off the flight stick, the plane would have stayed in the air. So the most immediate, obvious conclusion is to simply blame Bonin for the crash. His actions did crash the plane. His co-pilots overlooked his mistakes. But that conclusion is missing a pretty obvious point: Bonin and his colleagues were all trained, experienced pilots who would have made different decisions with better feedback. We can blame humans for every error in the world, at some level, but the point of good design is to understand human nature, accommodate it, and most importantly, enable it.

[Images: Cray Photo, Cray Photo, and TEA via Shutterstock]


47 Comments

  • IT WAS NOT DESIGN

    Yeah, and never mind they never followed the specific procedure for unreliable airspeed, which is also a memory item.  It tells them what to do and it's simple - power to CLIMB, nose up 5 deg.  TURN OFF the Flight directors.  Was it the faulty flight directors info that "misled" the crew to go up?  If you're not going to follow the procedure and turn them off, at least don't use them with all those ADR faults (recognized by the crew).  That's just awful incompetence. 
    It is true had they done nothing, this would have been a minor non-incident.  That's what most crews do (instead of following the procedure).  But put the nose up so high, at such high altitude, with a very heavy plane in unfavorable weather conditions, go over the altitude they knew was out of reach (mentioned by PF Bonin just minutes earlier)?  They still had their Altitude and Attitude Indicators working.  PNF Robert notices the climb in the beginning ("according to this you're going up.  According to all three you're going up" from the CVR), he just never followed through.  PF Bonin, btw, is scared sh*tless by the weather before captain Dubois leaves for his break.  But captain Dubois doesn't seem to see or care; all he cares about is his break.  He shows he wants to avoid being a captain and making decisions regarding weather avoidance or strategy.  And this RIGHT AFTER he realizes he has made a mistake and has entered a bad storm (AF447 being the only flight not to divert right away that night).  PNF Robert would later discover the captain's radar was not set properly to show the weather ahead.
    There was a very similar and previously unreported incident with a TAM A330, in 2003 I believe.  The PF pulled up just like Bonin and started stalling.  The difference is, once they heard the STALL warning, the crew RIGHT AWAY acted and put the nose down.  That's why their incident is something minor and unknown until AF447.
    However, there are always ways to improve the tech-pilot interface, especially thinking long-term.  But it is not feedback.  Pilots used to make A LOT MORE mistakes in uncomputerized planes that had all the feedback in the world.
    And lastly, they never got away from the stall.  The STALL warning reappears when the speed climbs back to 60 knots -- still not enough to support the plane at any angle of attack.  So the nose-down input was just the beginning of the right course of action.

  • Sara Jamms

    BTW, in the "previously unreported incident" where the pilots correctly responded to the stall warning, it was daylight and they could see the horizon, so they had a good sense of their angle of attack, independent of the lack of airspeed indication. The Air France flight happened in the middle of the night, in IMC, so the pilots had no idea of their angle of attack.

    The Air France PF was much more concerned with overspeed, which has also crashed planes with frozen pitot tubes, than he was with stalls. An angle of attack indicator almost certainly would have prevented this crash.

  • Sara Jamms

    I read the BEA report and think Mark Wilson got a lot more right than he got wrong. First of all, the plane never gave any clear indication to the pilots that the source of the problem was invalid airspeed indicators. They knew some of the instruments were wrong but they didn't know which ones. The sad part of the story is that the computers knew what was wrong, but they didn't tell the pilots. That's exactly the kind of user interface and human factors design problem Mark is talking about.

    The autopilot and auto thrust automatically disengage with bad ADRs, so why doesn't the Flight Director automatically disengage?

    The stall warning was the worst. The thing to do with the stall is put the plane nose down. But the PF experienced that when he pulled back on the stick the stall warning stopped and when he pushed it forward the stall alarm went off again. No wonder he didn't trust it!

    There's no doubt that better UI would have prevented the crash. It's that simple.

  • Tim

    As a private pilot and aerospace engineer specialised in aircraft handling qualities I was glad to read Ari's comment after all the wrong conclusions of Mr. Wilson.

  • Ari

    This article shows why any usability person also needs to be a domain expert to make any competent suggestions or observations. I am a commercial pilot and a cognitive scientist/UI expert, and can emphatically state that this article is wrong in every major detail, as well as most smaller ones.

    The crew made elementary mistakes for which I would have chastised a beginner student in his second "hood" training session (private pilot training for accidentally getting into low visibility situations). When they lost situational awareness they were doomed.

    You would also be surprised at how useless stick feedback is when you cannot see out. It really doesn't give any useful information. Airbus left it out for a good reason, and if feedback had been given then it would have been based on what the computer thought was correct (which would have been wrong feedback).
    The aircraft display industry is working hard at optimising cockpits and getting an optimal amount of displayed stuff, and finding ways of showing "process" information (i.e. showing how the system comes to conclusions and gives suggestions) which might in some cases show AoA (angle of attack) in the future.

    But an understanding of the aircraft, its systems, and basic procedure, are still the fundamentals. Without those then no amount of information will save you in an emergency. And that is what the pilots lacked. Possibly not their own fault (not entirely, anyway) but rather Air France's due to bad training practices and possibly hiring practices as well. By that I mean that airlines are mostly hiring systems managers who follow procedures instead of pilots with a passion for flying. See Captain Sullenberger (landed on the Hudson River) for examples of why people with flying skills and passion can be useful.

  • Sara Jamms

    Did you read the final BEA report? Mark isn't making wild claims based on a lack of understanding of flying aircraft, he's summarizing salient points from the official report by the accident investigation experts. You know, domain experts who spent over a year looking at all the information about the crash.

    BTW, Captain Sullenberger also thinks the lack of tactile feedback, and in particular the fact that the one pilot (PNF) can't tell what the other pilot (PF) is doing with the side stick, is a significant problem. He also harshly criticizes the lack of an angle of attack indicator, which alone most likely would have prevented this crash.

    Like any other system on the plane, pilots can be expected to fail occasionally. Good design should seek to minimize the danger associated with a system failure. Yes, the PF made a crucial mistake immediately after the autopilot failed. He pitched the nose up to 10°. Every other thing that went wrong on the plane and contributed to the crash was a failure of the user interface to properly and clearly communicate to the pilots.

  • Nalliah Thayabharan

    Pilots' lack of familiarity and training, along with system malfunction, contributed to this terrible accident.

    This Airbus A330-203 did not have multiple independent systems for detecting airspeed, such as a GPS-based system that would at least cross-check the readings being given by the pitot tubes and then provide a cockpit warning that the airspeed could be wrong, or another safety mechanism whereby the pitot tubes are heated so that ice could not occlude them. The pilots did have a display of groundspeed, and they should have been keeping an eye on it as part of their constant instrument scan.

    To recover from a deep stall, set the engines to idle to reduce the nose-up side effect and try full nose-down input. If that has no success, roll the aircraft to above a 60° bank angle and use rudder input to lower the nose in a steep engaged turn. Practicing recovery from "Loss of Control" situations should be a mandatory part of recurrent training for all pilots.

  • Mark Simchock

    The moral of the story is that UI and UX should be thorough and complete. Simple in and of itself is not an end. It certainly can be a means, but not if it sacrifices a thorough and complete end product. Simple might be a visual aesthetic, but it's not necessarily an optimal user experience.

  • Simon

    You wrote an article quoting a sensationalist piece by The Telegraph as fact? There are so many inaccuracies in this article that it's clear that no one with an iota of aviation knowledge fact-checked it. Lousy would seem an appropriate description.

  • Bill Palmer

    The article appears to be written by someone who didn't really know what he was talking about. Using a Telegraph article as a source is absurd - did he not read the currently available BEA report himself?
    >>Though the plane was no longer stalling, it began the “stall” alerts again, in essence, causing Bonin to pull back on the stick and force a stall.<< Huh?? So, you think Bonin was tricked into pulling back on the stick by the stall warning? That doesn't make any sense at all.
    There were over a dozen industry incidents with the pitot tube blockage (please use proper terminology vs. "ducts") under these certain atmospheric conditions, and no other aircraft suffered an upset. Other more competent pilots maintained their pitch and power and were back to normal usually in under a minute.
    Sorry to say, but jet airliners are meant to be flown by professional aviators with specific training in the make and model - hence the type rating. They are expected to understand the airplane and know how to properly operate it. To be near one's maximum altitude and then pull back 15° is a total non sequitur. Holding back on the sidestick with the stall warning going off - completely contrary to everything proper. (In the transcript it appears as though Bonin's reaction to the English-voiced "stall stall" is "what's that?", in French.) Yeah, that could indicate a bit of a competence problem here.
    Just because you don't like a particular design philosophy does not make it a design flaw. Indeed, this "flawed" design philosophy will have prevented accidents that may have occurred in conventional flight control aircraft.

    To put it simply, in the Airbus fly-by-wire (FBW)  design (also common with some other FBW types - and even close to space shuttle c* fly-by-wire control laws), you tell the airplane "what to do" not "how to do it."

    The author's conclusion, that Bonin "would have made different decisions with better feedback," is based on what, his own made-up thorough evaluation of all the accident factors? HA! There is no evidence to support that!

    [A330 captain, instructor/check pilot - also typed in A320, B777, DC10, 757/767]

  • Sara Jamms

    Did you read the BEA report, Bill? The conclusion that Bonin "would've made different decisions with better feedback" is straight from the BEA report.

    It clearly calls out the problem that the PF had with the stall warning. He pulls back on the stick, the stall warning stops; he pushes forward on the stick, the stall warning starts again. It really looked to the BEA like he was trying to put the plane nose down, but every time he did, the stall warning came on and he backed off.

    The #1 design problem was that nothing on the cockpit display directly told the pilots that the ADRs were bad. The #2 problem was that without airspeed, they could not determine angle of attack because there was no dedicated AoA indicator. Those are two user interface design failures. Fixing either one of them probably would have prevented the crash.

    Also, remember they were flying through heavy turbulence, so they couldn't just set the pitch and throttle and forget it, they had to actively fly the plane to keep the wings level.

  • I still don't quite understand why a pilot would believe that pitching the nose down could put you into a stall. Maybe Bonin had a kind of panic-induced brain-lock and essentially couldn't think independently of the signals the plane was giving him.

  • Robert Bramel

    Well stated, Mark; however, I'd like to add that even with suboptimal design, adequate training can do wonders to compensate. Apparently neither the second nor the third officer (the third being the pilot flying) had ever been trained in high-altitude stalls. That shortcoming seems so egregious it is hard to imagine how it could have happened! The NOVA TV show, released before recovery of the black boxes, stated that there is a specific procedure for the crew to follow when the computers shut down (something like set the throttles to 85 percent and the elevator angle to 5.) This approach was never implemented.
    To Matthew, the Airbus pilot: I'd suggest that any pilot error falls squarely on the shoulders of AF. If three line pilots "incorrectly manipulated the controls...", what is wrong with AF that they didn't know the level of incompetence of this crew? As a crash investigator observed decades ago, "attributing cause to pilot error is as correct and useless as saying the passengers died because of excessive g-forces." Neither provides useful information.

  • One other thing that is rarely commented on in these analyses (understandably in some cases, as the info only came fully to light later): at least two of these pilots were short of sleep. Dubois admitted that he had only gotten an hour of sleep the night before and was called back shortly after leaving for what would have been his rest period. Robert told the others he hadn't managed to get much sleep during his rotation rest. We know that sleep-deprived people take decisions and actions little better than those of somebody who is drunk. So we end up with a cockpit where two pilots have brains functioning below full capacity, and the third is the least experienced and is spooked by the conditions and by dropping out of autopilot. None of the three is operating properly mentally.

  • Sara Jamms

    Yes, training can compensate for bad design, but it's preferable for great design to compensate for errors the pilot might make.

    In this case, all of the errors the pilots made were due to misunderstanding the situation they were in. If that's not a failure of design, I don't know what is.

  • Matthew Tinnelly

    As an Airbus pilot, I find the casual nature of the conclusions drawn worrying at best and incorrect at worst. I have never felt 'out of the loop' flying an Airbus. As has been mentioned before, the key to this accident was the loss of situational awareness and the captain choosing to take his rest at a phase of flight which carried some of the greatest risks. I know a pilot who has encountered this exact problem, in the same aircraft type. This type of incident, while not an everyday occurrence, should never result in hull loss. As a pilot it is never nice to say (even though in the majority of cases it is true) that the cause of the accident was pilot error. The pilots incorrectly manipulated the controls, on which they received the appropriate training every year.

  • Trock

    Cathay Pacific Flight 780
    Air France Flight 447
    A330 test flight crash
    Philippine Airlines Flight 812
    4 accidents in 20 years, very unsafe airplane. Get a grip, the crew was at fault in the AF incident. I have flown the A330 across the Pacific and the Atlantic and I find it 110% better than crossing the Atlantic in a 757.