Insight
A Conversation with Gary Klein [7.2.11]

Judgments based on intuition seem mysterious because intuition doesn't involve explicit knowledge. It doesn't involve declarative knowledge about facts. Therefore, we can't explicitly trace the origins of our intuitive judgments. They come from other parts of our knowing. They come from our tacit knowledge and so they feel magical. Intuitions sometimes feel like we have ESP, but it isn't magical, it's really a consequence of the experience we've built up.
INTRODUCTION
By Daniel Kahneman

We are prone to think of the British as snobbish, a label that is rarely used to describe Americans. When it comes to the adjective "applied," however, the tables are turned. The word "applied" does not have any pejorative or diminishing connotation in Britain. Indeed, the Applied Psychology Unit at 15 Chaucer Road in Cambridge was for decades the leading source of new knowledge and new ideas in cognitive psychology. The members of that Unit did not see their applied work as a tax they had to pay to fund their true research. Their interest in the real world and in theory merged seamlessly, and the approach was enormously productive of contributions to both theory and practical applications.
In the US, the word "applied" tends to diminish anything academic it touches. Add the word to the name of any academic discipline, from mathematics and statistics to psychology, and you find lowered status. The attitude changed briefly during World War II, when the best academic psychologists rolled up their sleeves to contribute to the war effort. I believe it was not an accident that the 15 years following the war were among the most productive in the history of the discipline. Old methods were discarded, old methodological taboos were dropped, and common sense prevailed over stale theoretical disputes. However, the word "applied" did not retain its positive aura for very long. It is a pity.
Gary Klein is a living example of how useful applied psychology can be when it is done well. Klein is first and mainly a keen observer. He looks at people who are good at their job as they exercise their skills, sometimes in life-and-death situations, and he reports what he sees in clear and eloquent prose. When you read his descriptions of real experts at work, you feel that it is the job of theorists to accommodate what he has seen – instead of doing what we often do, which is to scan the "real world" (when we think of it at all) for illustrations of our theoretical notions. Many of us in cognitive and social psychology are engaged in the important exercise that Lee Ross has wonderfully described as "bottling phenomena" and our theories are built to fit what we bottle. Klein himself is a theorist as well as an observer, but his theoretical ideas start from what he does for a living: they are intended to describe and explain a large chunk of behavior in a context that matters. It is instructive to note which of the concepts that are current in academic psychology turn out to be useful to him.
Klein and I disagree on many things. In particular, I believe he is somewhat biased in favor of raw intuition, and he dislikes the very word "bias." But I am convinced that there should be more psychologists like him, and that the art and science of observing behavior should have a larger place in our thinking and in our curricula than it does at present.
— Daniel Kahneman
GARY KLEIN is a research psychologist famous for his work in pioneering the field of naturalistic decision making. Among his books are Sources of Power and Intuition.
DANIEL KAHNEMAN, Eugene Higgins Professor of Psychology, Princeton University, is the recipient of the 2002 Nobel Prize in Economics. He is the author of Thinking, Fast and Slow.
INSIGHT
[GARY KLEIN:] What's the tradeoff between people using their experience (people using the knowledge they've gained, and the expertise that they've developed), versus being able to just follow steps and procedures?
We know from the literature that people sometimes make mistakes. A lot of organizations are worried about mistakes, and try to cut down on errors by introducing checklists, introducing procedures, and those are extremely valuable. I don't want to fly in an airplane with pilots who have forgotten their checklists, and don't have any way of going through formal procedures for getting the plane started, and handling standard malfunctions. Those procedures are extremely valuable, and I don't doubt any of that. The issue is how does that blend in with expertise? How do people make the tradeoffs when they start to become experts? And does it have to be one or the other? Do people either have to just follow procedures, or do they have to abandon all procedures and use their knowledge and their intuition? I'm asking whether it has to be a duality. I'm hoping that it doesn't, and this gets us into the work on system one and system two thinking.
System one is really about intuition, people using the expertise and the experience they've gained. System two is a way of monitoring things, and we need both of those, and we need to blend them, and so it bothers me to see controversies about which is the right one, or are people fundamentally irrational, and therefore they can't be trusted? Obviously system one is marvelous. Danny Kahneman has put it this way, "system one is marvelous, intuition is marvelous but flawed." And system two isn't the replacement for our intuition and for our experience, it's a way of making sure we don't get ourselves in trouble.
If we eliminate system one, system two isn't going to get the job done because you can't live by system two. There are people who try, there are people who have had various kinds of brain lesions that create disconnects between their emotions and their decision-making process. Damasio has written about them. It can take them 30 minutes to figure out what restaurant they want to go to. Their performance on intelligence tests isn't impaired, but their performance in living their lives is greatly impaired; they can't function well, and their lives go downhill.
So we know that trying to do everything purely rationally, just following Bayesian statistics or anything like that isn't going to work. We need both system one and system two, and so my question is what are the effective ways of blending the two? What are the effective ways that allow people to develop expertise, and to use expertise while still being able to monitor their ideas, and monitor their actions?
Too often it's treated as a real dichotomy, and too many organizations that I study try to encourage people to just follow procedures, just follow the steps, and to be afraid of making any mistakes. The result is that they stamp out insights in their organization. They stamp out the development of expertise in their organization, and they actually reduce the effectiveness and the performance of the organization. So how you blend the two is the issue.
A project that I'm working on now is about what to do with children who are in dangerous home environments, where a Child Protective Service worker has to judge the potential for abuse. Do you leave the child with the parents, who may do damage to the child, or do you remove the child for its own protection, which creates its own set of problems?
A complicating factor here is that Child Protective Service workers are not all that well paid (very few of them are well paid), there is a lot of turnover, and they don't develop expertise, so there is a temptation to turn it all over to checklists, and to say, "Here are the factors, these are the things to go through, these are the objective criteria." Some of the criteria are useful and need to be taken into account, but there must also be an aspect of empathy, of a caseworker looking at the situation and saying, "This doesn't feel right to me. There is an edge to the way the parents are interacting with the child, there is a feeling of hostility I'm picking up, there is a feeling of menace in the way the parents are acting, and the way the child is acting," and I don't think we want to lose that. I'm afraid that the temptation to proceduralize and checklist everything can get in the way of those kinds of insights, and those kinds of social concerns that seem so important.
I started moving in this direction when I began working as a research psychologist for the Air Force, at Wright-Patterson Air Force Base. I worked for an organization called the Air Force Human Resources Laboratory, starting in 1974. I had been in academia, and I enjoyed that, but this was an opportunity to do something different. It was a wonderful time to be at Wright-Patterson, because the Arab oil embargo had hit in 1973 and the price of jet fuel had gone sky high. All of a sudden the pilots, who had been used to doing all their training by just flying (they loved to fly; that's what really got them excited), were going to have to develop their skills and be trained in simulators. My unit was studying simulation training. Up to that point pilots hadn't cared much about what the unit did, because they just wanted to fly. Now it mattered, because they were going to spend a lot of time in simulators building their skills, and so this meant a lot to them.
For me, it was a chance to confront some basic issues about the nature of expertise because we were trying to develop expertise in the pilots in an artificial setting, which is not as good as being able to fly. But in some ways it's better because you can practice certain maneuvers, you can confront them with certain malfunctions that you can't do in the air. It opened up the question, what is the nature of expertise, how does it develop, how can we use these kinds of devices to develop the pilots more effectively?
We did that for a couple of years, and as I was doing it, the part of expertise that intrigued me was how people make tough judgments and decisions. I wanted to get involved in that, I wanted to study it. There wasn't any place for doing so in my branch, so I left and started my own company in 1978 to examine those sorts of issues, and struggled for a while to figure out where I was going to go with this and how to get funding.
Then in 1984, a notice came out from the Army Research Institute asking for proposals about how people make life and death decisions under extreme time pressure and uncertainty. We said, "That sounds like the sort of thing that we wanted to get involved with." We knew that the standard way you do that kind of research is you pick a laboratory task, a well-studied, well-understood task, you vary the time pressure, you vary the uncertainty, and you see the effect. But we didn't want to do that because according to all the literature we had seen, it should be impossible to make good judgments and good decisions in less than a half hour. Yet we know people can do it and we didn't know how.
Rather than perform standard research and manipulate variables, we said, "Let's talk to the experts." We found some experts; we found firefighters. We decided we would study firefighters because that's what they do for a living, that's how they've trained themselves, that's how they develop themselves, and because we figured when they weren't fighting fires, they would have time to talk to folks like us. There was another advantage that we didn't know at the time, which is that firefighters are a wonderful community to work with. They're very friendly, they're very open to being helpful. We went in, and said, "The Army wants to do a better job of preparing people to make tough decisions. You guys are the experts, are you willing to share what you've learned?" And they said, "We'd love to help." They were marvelously cooperative.
That's the way we began the research. We got funded, and then we had to go out and see what the firefighters were doing. We started out in the wrong direction. At first, we thought we would just ride along with them. But they don't have enough interesting two-alarm and three-alarm fires, and we were going to waste all of our energy and all of our funds sitting around in fire stations, so we said, "Let's not do that, let's interview them about the tough cases that they've had."
I remember the first firefighter I interviewed. This was just a practice interview, and I said, "We're here to study how you make decisions, tough decisions."
He looked at me, and there was a certain look of not exactly contempt, but condescension at least, and he said, "I've been a firefighter for 16 years now. I've been a captain, a commander, for 12 years, and in all that time I can't think of a single decision I ever made."
"What?""I don't remember ever making a decision."
"How can that be? How do you know what to do?"
"It's just procedures, you just follow the procedures."My heart sank, because we had just gotten the funding to do this study, and this guy is telling me they never make decisions. So right off the bat we were in big trouble. Before I finished with him, before I walked out, I asked him, "Can I see the procedure manuals?"
Because I figured maybe there's something in the procedure manuals that I could use to give me an idea of where to go next. He looked at me again with the same feeling of sort of condescension, (obviously I didn't know that much about their work) and he said,
"It's not written down. You just know."
"Ah, okay, that's interesting."Something was going on here. It feels like procedures to them, but it's not really procedures, and it's not that they're following the steps of any guide, or any set of checklists. We conducted a few dozen interviews to what people were doing, and we collected some marvelous stories, and some really very moving stories from them about how they made life and death decisions. What we found was that they weren't making decisions in the classical sense that they generated a set of options, and they looked at the strengths and the weaknesses of each option, and then they compared each option to all the others on a standard set of dimensions. I mean, that's classical management-type decision-making, get your options, A, B, and C, get your evaluation dimensions, rate each option on each dimension, see which comes out ahead. They weren't doing that.
When I asked him, "How do you make decisions?", that classical process is what he assumed I was asking about, and they never did it, and that's why he gave me that response. Others gave me the same response, and pretty quickly we stopped even asking them how they made "decisions." Instead, we would ask them about "tough cases": cases where they had struggled, cases where they maybe would have done things differently, cases where they might have made mistakes earlier in their careers. We asked it that way to get away from the term "decision-making," because that was just leading us in the wrong direction.
I'll give you a few examples. In the first, a firefighter insisted he never made any decisions, so I didn't even push it any further.
I said, "Tell me about the last run you went on."And he said, "It was about a week ago."
"Tell me what happened."
I figured maybe we could learn something from it. He said, "It was a single-family house, and we pull up, and I see there is some smoke coming out from the back."
"Right away", he says, "I figure it's probably a kitchen fire. That's a typical thing that goes on here. And I know how to handle kitchen fires."He starts to walk around the house, but he's pretty sure he knows what's going on, so he sends one of his crew in with an inch and three quarter line. He says to the men, "Go into the house, go in the back, and get ready, and when you reach the fire, just knock the fire down." That's what they did, and they put the fire out.
He said, "You see, there was no decisions, it was all just standard."
I said, "You know, I've always been told if a house is on fire, you go out of the house. And you just told me you sent your crew into the house. Why did you do that? Why didn't you just go around the back, break a window, and use your hoses from the back and hit the fire that way? That way none of your crew is inside a burning house. That seems to be the way I'd do it."
He looked at me with a little bit of condescension (I get a lot of condescension in this line of work) and he said, "Hm, yeah, maybe some volunteer fire companies might do it that way, but it's a bad idea. Let me tell you why. If you do it that way, what are you doing? You're pushing the fire back into the house, and now you're spreading it further into the house, and we don't want to do that. That's why we want to go into the house, get to the fire, and then drive it out. We want to hit it with water, and all the momentum, all the direction, is pushing the fire out of the house, and not into the house. Now, there are times when we can't do that kind of an interior attack, like if there is another house right next to it that we could set on fire; then we would do it externally. But in this case, there weren't any complicating factors."
I said, "Thank you, thank you."
A simple situation. But, in fact, he had encountered a decision: do you do an interior attack, or an exterior attack? There was a decision point. He didn't experience it as a decision point, because for him it was obvious what to do. That's what 20 years of experience buys you. You build up all these patterns, you quickly size up situations, and you know what to do, and that's why it doesn't feel like you're making any conscious decisions: he wasn't setting up a matrix. But that doesn't mean he wasn't making real decisions, because the decision I would have made, he thought was a bad choice in this particular situation.
That became part of our model -- the question of how people with experience build up a repertoire of patterns so that they can immediately identify, classify, and categorize situations, and have a rapid impulse about what to do. And not just what to do: they're framing the situation, and their frame is telling them what the important cues are. That's why they're always, or usually, looking in the right place. They know what to ignore, and what they have to watch carefully.
The frame is telling them what to expect, and that's why the performance of experts is smoother than the performance of novices: they're not just doing the current job, they know what to expect next, so they're getting ready for it. It's telling them what the relevant goals are, so that they can choose accordingly.
Sometimes you want to put a fire out, and sometimes the fire has spread too much and you want to make sure it doesn't advance to other buildings nearby, and sometimes you need to do search and rescue. They've got to pick an appropriate goal. It's not just "put the fire out" each time.
It's also telling them what to expect, and by the way, when they think about what to expect, that gives them another advantage, because if their expectancy is violated, that's an indication, "Maybe I sized it up wrong. Maybe my situation awareness is wrong, maybe the way I've made sense of it is leading me in a wrong direction, I framed it in the wrong way, and I've got to rethink it."
We saw many examples where they would be surprised because their expectancy would be violated, and that would stop them in their tracks. That's the importance of a frame; it gives you expectancies so that you can be surprised. The frame is also telling them what's the reasonable course of action.
But that's only part of the decision-making story, because that gives you your initial intuitive impulse about what to do. How do you evaluate that course of action?
When I started the research, I always assumed that the way you evaluate a course of action is to compare it to other courses of action, as in the standard model, to see which one is better. But these firefighters were making decisions in just a few seconds, not enough time to evaluate by comparing options. Besides, they were telling us that they never compared one option to others. So what was going on there?
We looked at some of the cases, and we got an indication of what was happening.
I'll give you an example. A harness rescue example is one that I learned a lot from:
A firefighter gets called out; there's an emergency. It seems that a woman has decided to kill herself. She went to an overpass high over a highway and jumped off to try to kill herself, but she missed. Instead of falling to her death, which was her plan, she fell just a little bit and landed on the struts that were holding up one of the highway signs, and she reflexively grabbed onto a strut. The firefighters and the emergency rescue squad were called in to try to save her. They pull up, and they see her there, and now the fire commander has to figure out, "How am I going to do this rescue?"
In the meantime, another piece of equipment, a hook and ladder truck, had been called in, and they radioed to him,
"How do you want us to help you?"He didn't want to think about them, he just wanted to get them out of his hair, so he said,
"Go drive down to the highway below, block the highway. In case the woman falls, we don't want her falling on top of a car, killing a motorist, you just block them."
Now he's got to figure out how to make the rescue. The standard way firefighters make a rescue is with a Kingsley harness: you snap it onto the shoulders, you snap it onto the thighs, and you lift the person up to safety. You've seen it on TV; this is how they make rescues and get people out of dangerous situations. They lower the Kingsley harness, the person snaps it on, and they lift them up to safety. It wasn't going to work here, because this woman was on drugs, or alcohol, or both; she was semi-unconscious.
I forgot to tell you one other part of this story: as soon as he got there, the commander told his crew,
"Nobody go out there, because it's too risky, it's too dangerous."Two of his crew members immediately ignored him. They climbed out to try to protect her. One was sort of holding her legs, one was holding her arms. The commanders was glad that they were doing that because it was keeping her secure, and if anything happened to them, everybody had heard him tell people not to do it. So he was sort of covered either way. Now he's got two firefighters with her, and he thought,
"Can we have them lift her up to a sitting position and then snap on the Kingsley harness?" But they're also balancing on those struts. There's not going to be any easy way for them to lift her up without making it really risky for all of them. So he rejects that option, that's not going to work.Then he says, "You know, maybe we can attach the Kingsley harness from behind." She was face down.
So he thinks, "We can try that."
But then he imagines what would happen as they lifted her, and he imagines the way her back would sag, and it was a painful vision, a painful image, and he thought,
"Too much of a chance we're going to do damage to her back, that's a bad option."
He rules that out. He thinks of another few options, and when he imagines each of them, they're not going to work, so he rejects all of them.
Then he has a bright idea, he has a clever idea. What about a ladder belt? A ladder belt is what firefighters wear. Ever see on TV a firefighter on a second or third floor, and they've got a hose, or they're helping people onto a ladder? The ladder is up two or three stories high. They're at the top.
You think, "What if these guys fall?"But what happens is they have a ladder belt that they've cinched up around their waist, and when they get to the top, there's a snap, and they snap it to the top wrung, so they're well-secured.
The commander thinks, "We can just get a ladder belt. We can just lift her an inch, slide the ladder belt underneath her, buckle it from behind. We have a rope, we'll tie a rope to it, and we lift her up, and we can lift her to safety that way."
And he imagines it; he does what we call "a mental simulation." He works it through in his head to see if there will be any problems, and he can't think of any. So that's the way he makes the decision to do the rescue.
They lower the ladder belt; the firefighters slide it underneath her and tighten it up. In the meantime, the hook and ladder crew below is getting bored, because they're just standing there, not doing anything. So they put somebody on the ladder, and they're raising it to do the rescue, and they're saying, "We're getting there, we're going to be able to rescue her."
So there's a race because he wants to get her rescued before they get there.
They cinch up the ladder belt, they tighten the buckle, and they start to lift, and that's when he realizes what was wrong with his mental simulation. Because a ladder belt is built to fit over a sturdy firefighter on the outside of their coat, which is a protective coat, and she was a slender woman. She was wearing a very skimpy sweater, and the firefighters tightened it to the last hole as tight as they could make it, and it wasn't tight enough. As they were lifting up, and I'll never forget his phrase, he said, "She slithered through like a slippery strand of spaghetti." She's sliding through. But the ladder people from below were right there, and as she's sliding through, they're rushing over and maneuvering the ladder that they were raising, and they catch her as she's sliding through, they make the rescue, and the woman is saved. So it worked, but not the way he expected.
What did we learn from that episode, that incident?
We learned about how he did the evaluation. He looked at several options, but he never compared them to see which was the best one compared to the others. He wanted the first one that would work, and he did this mental simulation. He did this imaging process for each one. That was the way he could evaluate one option at a time, not by comparing it to others, but by seeing if it would work in this particular context. Even though this didn't have the outcome he expected, that turned out to be another part of the decision-making strategy that the firefighters were using.
It's really a two-part strategy. The first part is the pattern matching that frames the situation and suggests what to do. The second part is the mental simulation, used to evaluate and monitor an option to make sure that it will do a good job. They would use their experience to do the mental simulation. In this case the firefighter's experience didn't go far enough, and so his mental simulation didn't work as well as he would have liked. Generally, though, it's a very powerful technique.
What I've described about their strategy is about how they use their intuition, because they're not making formal decisions, they're not making analytical decisions by comparing options. These are intuitive decisions, and by intuition here I'm talking about the way they are able to use their experience. This isn't just "top of my head, this feels good" decision-making. These are intuitions based on 10, 15, 20 years or more of experience that has allowed them to build a repertoire of patterns, which lets them quickly frame situations, size them up, and know what to do. But they're not just using intuition; they're balancing the intuition with the mental simulation, which is how they're doing the analysis. So we call their decision-making recognition-primed decision making. The decisions are primed by their ability to recognize situations, and balanced by the monitoring of the mental simulation.
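That description reads almost like pseudocode, so a sketch may help fix the idea. The following is a minimal, hypothetical illustration in Python; none of the names or data come from Klein's work, and the kitchen-fire details are invented to echo the story above. The property that matters is that options are generated one at a time, in order of typicality, and each is screened by mental simulation rather than scored against alternatives:

```python
# A minimal sketch of a recognition-primed decision (RPD) loop.
# Hypothetical names and data throughout; an illustration of the idea,
# not Klein's notation.

from dataclasses import dataclass

@dataclass
class Pattern:
    name: str          # e.g. "kitchen fire"
    cues: set          # what to attend to when framing the situation
    expectancies: set  # what should happen next if the framing is right
    actions: list      # typical courses of action, most typical first

def match_pattern(observed_cues, repertoire):
    """Frame the situation: return the first pattern whose cues all fit."""
    for pattern in repertoire:
        if pattern.cues <= observed_cues:
            return pattern
    return None  # nothing recognized: no intuitive option is generated

def mentally_simulate(action, situation):
    """Imagine the action playing out; True if no problem is foreseen
    (stands in for the commander picturing the ladder belt slipping)."""
    return not (action["risks"] & situation["complications"])

def recognition_primed_decision(observed_cues, situation, repertoire):
    pattern = match_pattern(observed_cues, repertoire)
    if pattern is None:
        return None
    for action in pattern.actions:        # consider one option at a time...
        if mentally_simulate(action, situation):
            return action                 # ...and take the first that works
    return None

# The kitchen-fire story, retold with invented data: the interior attack
# is the typical move unless a complication (a house next door that could
# catch fire) makes the simulation fail, in which case the next option
# in the repertoire is tried.
kitchen_fire = Pattern(
    name="kitchen fire",
    cues={"single-family house", "smoke from the back"},
    expectancies={"fire confined to the kitchen"},
    actions=[
        {"name": "interior attack", "risks": {"exposed house next door"}},
        {"name": "exterior attack", "risks": set()},
    ],
)

choice = recognition_primed_decision(
    observed_cues={"single-family house", "smoke from the back"},
    situation={"complications": set()},
    repertoire=[kitchen_fire],
)
print(choice["name"])  # -> "interior attack"
```

Note that there is no comparison step anywhere: the loop satisfices, which is exactly the contrast with the scoring-matrix sketch earlier.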
The concept of intuition gives a lot of people pause, because intuition feels magical. It feels like Luke Skywalker getting in touch with The Force, and we're not talking about that, about people somehow drawing on pyramid power, or something occult, or ESP. Although I've had a couple of people I've interviewed tell me they made decisions because of ESP, because they couldn't articulate the basis of those decisions.
Intuition is about expertise and tacit knowledge. I'll contrast tacit knowledge with explicit knowledge. Explicit knowledge is knowledge of factual material. I can tell people facts, I can tell them over the phone, and they'll know things. I can say I was born in the Bronx, and now you know where I was born. That's an example of explicit knowledge, it's factual information.
But there are other forms of knowledge. There's knowledge about routines. Routines you can think of as a set of steps. But there's also tacit knowledge, and expertise about when to start each step, and when it's finished, when you're done and ready to start the next one, and whether the steps are working or not. So even for routines, some expertise is needed.
There are other aspects of tacit knowledge that are about intuition, like our ability to make perceptual discriminations, so as we get experience, we can see things that we couldn't see before.
For example, if you ever watch the Olympics and you watch a diving competition: the diver goes off the high board, doesn't do a belly flop, dives in, and it looks clean, and the TV commentators are saying, "Look at the splash. The splash was bigger than it should have been; the judges are sure to catch that." What happened was that the diver's ankles came apart just as she was entering the water. Then they show it in instant replay, and sure enough, that's what happened. But the commentator saw it as it happened. To a viewer like me, that's invisible; I just saw the dive. But they know where to look, and they know the probable trouble spots, the difficult aspects. That's part of the patterns they've built up -- knowing how to direct their attention so they can see the anomalies. They see it as it's happening, not in replay. You can't tell somebody over the phone what to look for. You can say it after the fact, but they see it while it's going on. That ability to make fine discriminations is a part of tacit knowledge, and a part of intuitive knowing.
Another part is pattern recognition. If you go to a friend's house, and the friend for some reason has an album out, and there's a picture from when they were in the fourth grade, you can look at all the faces in the picture and say, "That's you, isn't it?" And most of the time you get it right. Now, the face doesn't look like the face of your friend right now, but we see the facial resemblance, we see the relationship of the eyes, and the eyebrows, and the nose, and all of that. We just have a pattern recognition that we're able to apply. That's another aspect of tacit knowledge.
A third aspect: if we have a lot of experience, we can sense typicality, which means we can also see anomalies. We get a sense that something is not right here, something doesn't feel right, and then we start to look for the specifics of what has gone wrong. That's another aspect of tacit knowledge, one we depend on to alert us to possible danger.
Another aspect of tacit knowledge is our mental models of how things work. Mental models are just the stories, the frames that we have to explain causal relationships: if this happens, that will happen, and that will happen, and we build these kinds of internal representations, these mental models about how things work.
A lot of people in New York have much more sophisticated mental models about the way the financial system works than they did back in 2006. After the meltdown in 2007 and 2008, there's a much better sense of where Wall Street comes in, how it helps, how it interferes, how perverse incentives come into play. People are much more sophisticated about the interplay of those kinds of forces. They've learned about the forces, and how they connect to each other. They can't tell you, they can't draw an easy diagram, but there's a level of sophistication that many people have that they didn't have before, and that's another aspect of tacit knowledge.
Judgments based on intuition seem mysterious because intuition doesn't involve explicit knowledge. It doesn't involve declarative knowledge about facts. Therefore, we can't explicitly trace the origins of our intuitive judgments. They come from other parts of our knowing. They come from our tacit knowledge and so they feel magical. Intuitions sometimes feel like we have ESP, but it isn't magical, it's really a consequence of the experience we've built up.
Moving forward from the work on decision-making: a lot of our decision-making depends on sense-making, on how we size situations up. As I've been studying sense-making, I've become interested in how people realize that there is a new way to size things up, how they form insights, and where insights come from. I've been looking at incidents where people generated insights, in an attempt to see what was going on. I realize that, again, this is certainly an aspect of tacit knowledge, because an insight will many times spring forward with no warning. There's no expectation, and all of a sudden you say, "Now I know what to do." So where does that sense come from, and what can get in its way? Those are the sorts of things that I've been investigating. I've been looking at different forms of these insights to get a better idea of whether it's always the same sort of process, or whether there are several related processes.
I'm finding it's the latter. There are several different routes by which people develop insights. A lot of the laboratory research on insight follows one of those routes: putting people in a position where they're trying to solve a puzzle and they reach an impasse, and they're stuck, and the key for that route is to escape the fixation, to reach the insight by realizing that they're making an inappropriate assumption. So it's really looking for the assumption that isn't working, trying to find the assumption that's fixating me so that I can get beyond it. But that's only one of the routes we're discovering. We're finding that there are a few other routes that are also important.
As we're looking at these examples and incidents of insight, we're noticing that several of the examples involve people helping others to gain insight, and that caught my attention. How can you help other people to gain an insight? One of the ways that allows people to do that is to help them to become aware of inconsistencies in their thinking.
Let me give you an example: It's one of my favorite examples.
A friend of mine, Doug Harrington, Senior, was a Navy pilot a number of years ago. He was a good Navy pilot. Being able to fly a jet plane is a tough job, but Navy pilots not only fly, they have to be able to take off from and land on an aircraft carrier. They have to land on a ship, and the ship is bobbing up and down, moving around, with the waves buffeting it. It's not like a rowboat, but there's still some movement there, and you've got to land on this moving platform. Doug could do that; he was great at it. He was an instructor, and he flew F-4s. As an instructor he would teach younger pilots to fly F-4s and to land them on aircraft carriers.
Then came a point in his career when it was time for him to move to a more sophisticated, more advanced airplane, an A-6. So he learned how to fly an A-6. No problem, because he's a very skilled pilot. Then came the day when he had to do his aircraft carrier landings in his A-6. He comes around, he's lined up to do his first landing, he's all set, and because these landings are so difficult, he doesn't just do the landing by himself. There is a landing signal officer (LSO) on the aircraft carrier who is watching him, vectoring all the pilots, telling them what to do, and if the LSO doesn't like what he sees, he waves the pilot off. And so Doug is listening to the landing signal officer, he's coming in to make his landing, and the landing signal officer is telling him,
"Come right, come right".
But Doug is perfectly lined up. So he does what any good Navy pilot would do under those circumstances. He ignores the landing signal officer because he knows that he's got himself lined up. And the LSO, the landing signal officer keeps saying, "Come right", and Doug doesn't really do it, and then the landing signal officer waves him off, which is weird, because he was perfectly lined up.
So now he has to go around for another try. And again, he's perfectly lined up, and the LSO is saying, "Come right," and this time he figures, "I'd better do it," so he comes right a little bit, and a little bit more, but not enough, and he gets waved off again. Now he has to come around for another go-around, and he figures, "I'm going to run out of fuel before he runs out of patience, so I'd better listen to him."
He tries to follow the instructions, and he manages to get the landing done, but he was supposed to do six landings that day and another four landings that night. He messes up all six landings. He makes them, but they're not good landings, and at the end of the day he's told, "Doug, you just didn't do a very good job today. You're not going to do the nighttime landings; that's too risky. You have to repeat your daytime landings tomorrow, and then we'll let you do the nighttime landings. But if you don't do a good enough job tomorrow, that's it. That's the end of your flying career in the Navy." Doug goes into shock, because everything was working well when he woke up that morning, and now his flying career may be over, and he doesn't know what happened.
His friends on the ship are there for him, and they come over to him, they're there, and they're telling him really useful things like, "Doug, you've really got to bear down tomorrow." Or, "Doug, this is important." In other words, useless advice, and they're just making him more anxious, they're driving him crazy.
At night he's ready to go to sleep, hoping it's all a bad dream and he'll wake up and somehow everything will work tomorrow, but he's just dazed. There's a knock on the door of his cabin, and he says, "Go away," because he doesn't want to talk to anybody. It's the landing signal officer, who, atypically, is trying to help Doug. Not that LSOs wouldn't be helpful, but it's not their job; their job is to help pilots land, not to be trainers. But he's also very troubled.
So he knocks on the door, and Doug finally lets him in, and says,
"I don't want you to tell me anything. People have been telling me things, it's not useful."
And the landing signal officer says,
"I'm not here to tell you anything Doug, I just want to talk to you, I just want to ask you something."
"Okay, what do you want to know?"
"I know you're a great pilot. Obviously you had trouble today. Tell me what you're trying to do.""I'm doing what I always do, I line up the nose of the plane on a center line of the landing strip there, and I've got it perfectly lined up, and you keep telling me "Come right, come right"."
"So you're flying an A-6. What did you fly before that?"
"An F-4, that's what I've been flying for years."
"In an F-4 it was either you or you sitting behind a student. In an A-6, you're side by side, so it's not exactly the same.""It's a foot and a half, it's not a big difference, it wouldn't make that much of a difference. Maybe two feet. It's not going to make a difference."
"Are you sure?"
"Yeah. I mean, I just line up the nose of the plane over the center line of the runway."
So the LSO says, "Let's try something"
And if you have a chance, I'd like you to try it as I'm doing this interview, if you don't mind.
He said, "Extend your arm straight out, put your thumb up. That's the nose of your airplane. Close one eye and align your thumb with a vertical line, some place in the room. You've got your nose of the airplane lined up at the center line. Now, move your head a foot and a half over", like Doug was moved over, because now he's flying an A-6. "And pull your thumb back over to that center line.
And you see what happens. It changes the whole alignment of the airplane because you're not on the center line, and it's only a foot and a half, but there's a parallax effect here that Doug wasn't thinking about. And Doug does this little demonstration, and as soon as he does it, he says,
"Ah. I'm an idiot. Obviously that's the problem."The next day he did his six landings during the day, and he had no trouble with them, and the four landings at night, and he went on with his career.
I love that story, because the LSO helped Doug achieve an insight. He had tried to explain things to Doug, but that wasn't getting Doug anywhere, so he created an environment, an experience, that allowed Doug to see the inconsistency in his beliefs, and once Doug saw the inconsistency, his mental model changed. I think helping people arrive at insights isn't a question of pushing the insights on them, or trying to explain it in words, as much as helping them gain the experience so they can see the inconsistency for themselves; then all of a sudden the mental model shifts naturally and easily. To me that's a gift that good teachers have: the ability to help the people they're trying to support, their students or colleagues, gain those insights for themselves.