The Power of Mental Models: How Flight 32 Escaped Disaster
On a sunny morning in 2010, Qantas Airways Flight 32 taxied onto a runway in Singapore, received clearance for an eight-hour flight to Sydney, and lifted off into a bright sky.
The following is an excerpt from Charles Duhigg’s book Smarter Faster Better: The Secrets of Being Productive in Life and Business.
A few minutes after takeoff, Captain Richard de Crespigny engaged the aircraft’s autopilot. But as the plane climbed through 7,400 feet, the pilots heard a boom. Then came another, even louder crash, followed by what sounded like thousands of marbles being hurled against the hull.
A red alarm flashed on de Crespigny’s instrument panel, and a siren blared in the cockpit. Investigators would later determine that an oil fire inside one of the left engines had caused a massive turbine disc to separate from the drive shaft, break into three pieces, and shoot outward, shredding the engine. Two of the larger fragments punched holes in the left wing, one of them big enough for a person to fit through. Hundreds of smaller fragments, exploding like a cluster bomb, severed electrical wires, fuel hoses, a fuel tank, and hydraulic pumps. The underside of the wing looked as if it had been raked by machine-gun fire.
The plane shook. De Crespigny reached out to reduce the aircraft’s speed, the standard response in such emergencies, but when he pressed the button, the autothrust didn’t respond. Alarms began filling his computer screen. The second engine was on fire. The third engine was damaged. There was no data at all for engines one and four. The fuel pumps were failing. The hydraulic, pneumatic, and electrical systems were barely functioning. Fuel was streaming from the left wing in a wide fan. The damage would later be described as one of the worst midair mechanical failures in modern aviation.
De Crespigny radioed Singapore air traffic control. “QF32, engine two appears to have failed,” he said.
Less than ten seconds had passed since the first explosion. De Crespigny cut power to the left wing and initiated fire-extinguishing protocols. The plane stopped vibrating for a moment. The siren in the cockpit fell silent, while in the cabin, frightened passengers rushed to their windows.
The men in the cockpit began responding to the plane’s computers, talking to one another in short, efficient sentences. De Crespigny glanced at his display and saw that twenty-one of the plane’s twenty-two major systems were damaged or completely disabled. The working engines were deteriorating fast, and the left wing was losing the hydraulics that made steering possible. Within minutes, the aircraft became capable of only the smallest changes in thrust and the slightest navigational adjustments. No one was sure how long it would stay in the air.
One of the copilots looked up from his control panel. “I think we need to turn back,” he said. Turning the plane around to return to the airport was risky. But on their current course, they were moving farther from the runway every second.
In recent decades, as computerized automation has spread through our workplaces and the information revolution has reshaped our lives, managing our attention has become more critical than ever.
“You can think of your brain’s attention span as a spotlight that can be broad and diffuse or tight and focused,” David Strayer, a cognitive psychologist at the University of Utah, told me while I was writing my book on the science of productivity, Smarter Faster Better. Our focus is directed by our intentions. In most situations, we choose whether to concentrate the spotlight or let it relax.
“But then, bam! Some kind of emergency happens, or you receive an unexpected email, or someone asks you an important question in a meeting, and suddenly the spotlight in your head has to swing around, and at first it doesn’t know where to shine,” Strayer said.
Unless, of course, you have trained yourself to react.
In the late 1980s, a group of psychologists at the consulting firm Klein Associates set out to figure out why some people stay so calm and focused amid chaotic environments, why some people, in other words, are better at controlling the spotlight in their heads. One researcher, Beth Crandall, began visiting neonatal intensive care units, or NICUs. A NICU, like any ICU, is a mix of chaos and banality, set against the constant beeping of machines and warning alarms. Many of the babies in a NICU are on the path to full recovery; they may have arrived prematurely or suffered minor injuries during birth, but they are not seriously ill. Others are unwell and need constant monitoring. What makes it especially hard for NICU nurses, however, is that it is not always obvious which babies are sick and which are healthy. Seemingly fine premature infants can deteriorate quickly; sick babies can recover unexpectedly. So nurses are constantly choosing where to focus their attention: the screaming infant or the quiet one? The new lab results or the worried parents saying something seems wrong? Crandall wanted to understand how nurses decide which babies need their attention, and why some of them are better at focusing on what matters most.
What interested Crandall most was the handful of nurses who seemed especially gifted at noticing when a child was in trouble. They could predict an infant’s decline or recovery from small warning signs that almost everyone else missed. Often, the clues these nurses relied on were so subtle that they themselves had trouble recalling what had prompted them to act. “It was as if they could see things no one else could,” Crandall told me. “They seemed to think differently.”
One of Crandall’s first interviews was with a talented nurse named Darlene, who described an incident from a few years earlier. Darlene had been walking past an incubator when she happened to glance at the baby inside. All of the machines connected to the infant showed vital signs within normal range. Another nurse was assigned to the baby and was watching her attentively, untroubled by what she saw. But to Darlene, something seemed wrong. The baby’s skin was slightly mottled instead of uniformly pink. Her belly seemed a little distended. Blood had recently been drawn from a prick on her heel, and the bandage showed a blot of crimson rather than a small dot.
Something about all of these small things occurring together caught Darlene’s attention. She opened the incubator and examined the baby. The newborn was conscious and awake. She winced slightly at Darlene’s touch but didn’t cry. There was nothing specific Darlene could point to, but this baby simply didn’t look the way she expected.
Darlene found the attending physician and said they needed to start the baby on intravenous antibiotics. All they had to go on was Darlene’s intuition, but the doctor, deferring to her judgment, ordered the medication and a series of tests. When the labs came back, they showed that the baby was in the early stages of sepsis, a potentially fatal whole-body inflammation caused by severe infection. The condition was advancing so fast that, had they waited any longer, the newborn would most likely have died. Instead, she made a full recovery.
“What amazed me was that Darlene and the other nurse saw the same warning signs, they had all the same information, but only Darlene detected the problem,” Crandall said. “To the other nurse, the mottled skin and the bloody Band-Aid were data points, nothing alarming enough on their own. But Darlene put it all together. She saw the whole picture.” When Crandall asked Darlene to explain how she knew the baby was sick, Darlene said that she carried a picture in her head of what a healthy baby should look like, and the infant in that incubator, when she glanced at her, didn’t match that picture. So the spotlight in Darlene’s head went to the baby’s skin, the blood spot on her heel, and the distended belly. It focused on those unexpected details, and that made Darlene uneasy. The other nurse, by contrast, had no strong picture of what she expected to see, and so her attention settled on the most obvious details: the baby was eating. Her heartbeat was strong. She didn’t cry. The other nurse was distracted by the information that was easiest to take in.
People like Darlene, who are particularly adept at managing their attention, tend to share certain characteristics. One is a propensity to create pictures in their minds of what they expect to see. These people tell themselves stories about what is happening as it occurs. They narrate their own experiences in their heads. They are more likely to answer questions with anecdotes than with simple responses.
Psychologists have a phrase for this kind of habitual forecasting: “building mental models.” Understanding how people construct mental models has become one of the most important topics in cognitive psychology. All of us rely on mental models to some degree. We all tell ourselves stories about how the world works, whether we realize it or not.
But some of us build more robust models than others. We envision the conversations we’re going to have with more specificity, and we are more precise about what we intend to do later in the day. As a result, we are better at choosing what to focus on and what to ignore.
Even before Captain Richard Champion de Crespigny stepped aboard Qantas Flight 32, he had been drilling his crew in the mental models he expected them to use.
“I want us to envision the first thing we’ll do if there’s a problem,” he told his copilots as they rode in a van from the Fairmont hotel to Singapore’s Changi Airport. “Imagine an engine failure. Where do you look first?” The pilots took turns describing where they would turn their eyes. De Crespigny held this same conversation before every flight, and his copilots knew what to expect. He quizzed them on which screens they would check during an emergency, where their hands would go if an alarm sounded, whether they would turn their heads to the left or stare straight ahead. “The reality of a modern aircraft is that it’s a quarter of a million sensors and computers that sometimes can’t tell the difference between garbage and good sense,” de Crespigny later told me. He is a brash Australian, a cross between Crocodile Dundee and General Patton. “That’s why we have human pilots. Our job is to think about what might happen, not just what is.”
After the visualization session, de Crespigny laid down some ground rules. “Everyone has a responsibility to tell me if you disagree with my decisions or think I’m missing something.”
“Mark,” he said, gesturing at a copilot, “if you see everyone looking down, I want you to look up. If we’re all looking up, you look down. We’ll all probably make at least one mistake on this flight. Each of you is responsible for catching them.”
So when the pilots flying Qantas Flight 32 began seeing emergency warnings flash across their dashboards, they were somewhat prepared. In the twenty minutes after the turbine disc punched a hole in the wing, the men in the cockpit faced a growing cascade of alarms and emergencies. The plane’s computer displayed step-by-step remedies for each problem, and the pilots relied on the mental models they had developed in advance to decide how to respond. But as the aircraft’s problems multiplied, the instructions became so voluminous that no one could tell how to prioritize or what to focus on. De Crespigny felt himself getting overwhelmed. One computerized checklist instructed the pilots to transfer fuel between the wings to balance the plane’s weight. “Stop!” de Crespigny shouted as a copilot reached out to obey the command on the screen. “Should we be pumping fuel from the good right wing into the leaking left wing?” A decade earlier, a plane out of Toronto had nearly crashed after the crew inadvertently pumped fuel toward a leaking engine. The pilots agreed to ignore the order.
De Crespigny slumped back in his seat. He tried to visualize the damage, to keep track of his dwindling options, to build a mental picture of the plane as he learned more about what was wrong. Throughout this crisis, de Crespigny and the other pilots had been constructing mental models of the Airbus in their heads. But everywhere they looked, there was a new alarm, another system failure, more flashing lights. De Crespigny took a breath, lifted his hand off the control panel, and rested it in his lap.
“Let’s keep it simple,” he told his copilots. “We can’t pump fuel and we can’t dump it. The fuel in the trim tank is stuck in the tail, and the transfer tanks are useless.
“So forget the pumps, forget the other eight tanks, forget the total fuel gauge. We need to stop focusing on what’s wrong and start paying attention to what’s still working.”
On cue, one of the copilots began calling out what was still in working order: two of the eight hydraulic pumps were still operating. The left wing had no electricity, but the right wing had some power. The wheels were intact, and the copilots believed de Crespigny would be able to apply the brakes at least once before they failed.
The first airplane de Crespigny ever flew was a Cessna, one of the single-engine, nearly computer-free airplanes beloved by hobbyists. A Cessna is a toy compared with an Airbus, of course, but every plane has essentially the same core components: a fuel system, flight controls, brakes, landing gear. What if, de Crespigny thought to himself, I imagine this plane as a Cessna? What would I do then?
“That moment is really the turning point,” Barbara Burian, a NASA research psychologist who studied Qantas Flight 32, told me. “Most of the time, when information overload occurs, we’re not even aware it’s happening, and that’s what makes it so dangerous. Really good pilots push themselves to run through lots of what-if exercises before a flight, rehearsing scenarios in their heads. That way, when an emergency occurs, they have models they can draw on.”
In other words, de Crespigny was willing to swap the mental model he had been relying on, because he recognized that the models he had developed in advance weren’t sufficient for the task at hand. He asked one of his copilots to calculate how much runway they would need. In his head, de Crespigny pictured landing an oversize Cessna. “That visualization helped me simplify things,” he told me. “I had a picture in my head with the basics, and that was all I needed to get the plane onto the ground.”
If de Crespigny did everything right, the copilot calculated, the plane would need 3,900 meters of asphalt. The longest runway at Singapore Changi was 4,000 meters. If they overshot, the plane would buckle as its wheels plowed into the grass fields and sand dunes beyond.
“Let’s do it,” de Crespigny said.
The plane began its descent toward Singapore Changi Airport. At two thousand feet, de Crespigny looked up from his panel and saw the runway. At a thousand feet, an alarm inside the cockpit began screaming, “SPEED! SPEED! SPEED!” The plane was on the verge of stalling. De Crespigny’s eyes flicked between the runway and his speed indicators. He could see the Cessna’s wings in his mind. He nudged the throttle, gaining a bit of speed, and the alarm stopped. He raised the nose slightly, because that’s what the picture in his head told him to do.
“Confirm the fire services are on standby,” the copilot radioed the control tower.
“Affirm, we have emergency services on standby,” a voice replied.
The plane was descending at fourteen feet per second; the maximum rate the landing gear was certified to withstand was only twelve feet per second. But there were no other options now.
“FIFTY,” said a computerized voice. “FORTY.” De Crespigny eased the stick back slightly. “THIRTY… TWENTY.” A metallic voice rang out: “STALL! STALL! STALL!” The Cessna in de Crespigny’s mind was still gliding toward the runway, ready to land the way he had landed hundreds of times before. He ignored the alarm. The Airbus’s rear wheels touched the ground, and de Crespigny pushed the stick forward, bringing the front wheels down onto the asphalt. The brakes would work only once, so de Crespigny pressed the pedal to the floor and held it there. The first thousand meters of runway flew past. At two thousand meters, de Crespigny thought he felt the plane begin to slow. The end of the runway raced toward them through the windshield, the grass and sand dunes growing larger as they approached. As the plane neared the end of the pavement, the metal began to groan. The wheels left long skid marks on the asphalt. Then the plane slowed, shuddered, and came to a stop with a hundred meters to spare.
Investigators would later determine that Qantas Flight 32 was the most damaged Airbus A380 ever to land safely. Several expert pilots tried to recreate de Crespigny’s recovery in simulators and failed every time.
When Qantas Flight 32 finally came to a stop, the lead flight attendant activated the aircraft’s announcement system.
“Ladies and gentlemen,” he said, “welcome to Singapore. The local time is five minutes to midday on Thursday, November 4, and I think you’ll agree that was one of the nicest landings we have experienced for a while.” De Crespigny went home a hero. Today, Qantas Flight 32 is taught in flight schools and psychology classrooms as a case study in staying focused during an emergency. It is cited as one of the prime examples of how mental models can bring even the direst situations under our control.
Mental models help us by providing a scaffold for the torrent of information that constantly surrounds us. Models help us choose where to direct our attention, so that we can make decisions rather than just react. We may not realize how much the situations in our lives resemble what happens inside an airplane cockpit. But think for a moment about the pressures you face each day. If you’re in a meeting and the CEO suddenly asks for your opinion, your mind has to snap from passive listening to active involvement, and if you’re not careful, cognitive tunneling might prompt you to say something you regret. If you’re juggling several conversations and tasks at once and an important email arrives, reactive thinking can cause you to type a reply before you’ve really thought through what you want to say.
So what’s the solution? If you want to get better at paying attention to what really matters, at not getting overwhelmed or distracted by the constant stream of emails, conversations, and interruptions that are part of every day, at knowing where to focus and what to ignore, get into the habit of telling yourself stories. Narrate your life as it’s occurring, and then, when your boss suddenly asks a question during a meeting or an urgent note arrives and you have only minutes to respond, the spotlight inside your head will be ready to shine the right way.
To become genuinely productive, we must take control of our attention; we must build mental models that put us firmly in charge.
“You can’t delegate thinking,” de Crespigny told me. “Computers fail, checklists fail, everything can fail. But people can’t. We have to make decisions, and that includes deciding what deserves our attention. The key is to force yourself to think. As long as you’re thinking, you’re halfway home.”
From the book Smarter Faster Better by Charles Duhigg. Copyright © 2016 by Charles Duhigg. Reprinted by arrangement with Random House, an imprint of The Random House Publishing Group, a division of Penguin Random House, Inc. All rights reserved. Illustration by Sam Woolley.