Sully not only shows the importance of psychological safety in a crisis; it also offers a great illustration of how cognitive biases can distort our thinking.
The movie recounts the real-life ‘Miracle on the Hudson’. After losing both engines to a bird strike 90 seconds after departing LaGuardia Airport in 2009, Captain Chesley ‘Sully’ Sullenberger III is faced with a terrible dilemma: attempt to reach the nearest available runway – and risk killing many people on the ground if unsuccessful – or attempt a near-impossible water landing on the Hudson River.
Sully, of course, opts for the latter, and lands the plane without losing a single life. But much of the film focuses on the subsequent investigation: did Sully make the right decision? Here we see a great illustration of several cognitive biases at work. And by recognising them, we can help prevent them from adversely affecting our own thinking.
The publication of Daniel Kahneman’s Thinking, Fast and Slow in 2011 did much to raise awareness of cognitive biases. Kahneman described the many ways our brains automatically take shortcuts to avoid, well, doing too much work. While these shortcuts are sometimes helpful, they can often lead us to mistaken conclusions. Here are several at work in Sully.
Fundamental Attribution Error
Fundamental Attribution Error is our tendency to attribute people’s behaviour to their personality, rather than the situation they find themselves in. For example, we might believe an unsmiling waiter is unfriendly or rude, rather than simply overloaded and stressed by a busy restaurant.
In aviation, thorough and open accident investigations have been instrumental in making flying one of the safest forms of transport. But in his book Black Box Thinking, Matthew Syed explains how even experienced aviation investigators fall prey to this bias. When confronted with an accident, investigators start piecing together possible explanations even before the black box has been recovered. According to Syed, studies show that investigators’ first instinct is almost always to blame ‘operator error’.
You can see this in Sully. Although Sully lands the plane with no casualties, the investigators are quick to direct blame at him. First they doubt whether both engines really did fail. Then they question his judgement in opting for a water landing rather than turning back. While it’s valid to ask these questions, fingers are pointed before all the available evidence has been collected.
Hindsight Bias
Hindsight bias is the common tendency for people to perceive that past events were more predictable than they actually were. As Syed notes: “It is tough to put oneself in the shoes of the operator, often acting in high-pressure circumstances, trying to reconcile different demands, and unaware of how a particular decision might pan out.”
This bias surfaces strongly in the crash investigation. In the scene below, Charles Porter, the investigator, has arranged real-time simulations to determine whether Sully could have successfully reached an airport. But the simulation pilots are instructed to turn back the moment the birds hit the engines – no thinking or reaction time has been built into the simulation. Because they already know the outcome, the pilots can react immediately.
As Sully points out, they’re not acting as human beings – human beings who, in reality, had to react to an unknown and unprecedented situation.
The Curse of Knowledge
The Curse of Knowledge is similar to hindsight bias. In essence, once we know something, we find it difficult to put ourselves in the position of someone who doesn’t know it. It’s particularly prevalent when people become experts in a field: when communicating, they often assume others have the background knowledge to understand them.
In 1990, Stanford psychologist Elizabeth Newton ran a fascinating experiment to demonstrate this phenomenon. She split volunteers into two groups: tappers and listeners. The tappers were asked to tap out the rhythm of well-known songs on a table; the listeners had to guess what the songs were.
Before the game, the tappers predicted the listeners would get 50% of the songs right. In the event, the listeners identified just three of the 120 songs tapped out – a mere 2.5%. Because the tappers already had the tune in their heads, they found it hard to imagine someone not being able to recognise it.
Sully, to his credit, doesn’t fall into this trap. In the scene above, he explains simply and persuasively why the simulations have been unrealistic. Won over, the investigators reluctantly allow an extra 35 seconds of ‘reaction time’ to be added to the simulation. As a result, the simulation pilots fail on both attempts – proving Sully’s instincts were correct.
As humans, we’re all subject to cognitive biases. But if we’re aware of what they are and how they can disrupt our thinking, we stand a better chance of avoiding them.