We're all prey to cognitive mistakes, says Nobel Laureate Daniel Kahneman. But knowing that can help you avoid them.
A GMJ Q&A with Daniel Kahneman, winner of the 2002 Nobel Prize in economics and author of Thinking, Fast and Slow
Thinking, Fast and Slow (Farrar, Straus and Giroux, 2011) is many things: fascinating, compulsively readable, easy to understand, often very funny. But it isn't flattering.
"It isn't particularly ingratiating either," says the author, Daniel Kahneman, Ph.D., Nobel Laureate and professor emeritus of psychology and public affairs at Princeton University. "It was specifically not written as 'a book that can make you a smarter person by following six easy steps.' I've been studying this stuff for 45 years, and I don't think I have gotten a lot smarter."
Yet people seem to think a lot of this book. It was on the "best book of the year" lists for Amazon, The New York Times Book Review, The Globe and Mail, The Economist, and The Wall Street Journal. And even if readers haven't gotten smarter, they have gotten better informed about the way their brains work.
Thinking, Fast and Slow explains that people use two methods of thinking -- System 1 and System 2. System 1 is fast, automatic, and context-oriented. It accepts as true whatever seems coherent with your worldview at the moment. If you're worried about your work performance and your boss walks by looking angry, for instance, System 1 will tell you that your boss is mad at you.
System 2 is slower, deliberative, and analytical. System 2 would tell you that your boss' commute is an hour long on a heavily congested road, and his drive might have been horrible that morning. The problem with System 2 is that it is lazy; it doesn't like kicking in. When it does, it literally takes a lot of energy to sustain -- thinking is a glucose-burning activity that involves the whole body, which is why you can feel more tired after a day at the computer than after a day hiking a mountain.
There are ways to work around the Systems, however, that can help you avoid mistakes and be a little less gullible and a little less lazy. As Dr. Kahneman discusses in the following conversation, understanding how you think may not make you smarter or improve your self-esteem, but it will help you use your Systems better.
GMJ: After reading your book, I should know better than to use System 1 thinking, but I'll do it anyway, won't I?
Dr. Kahneman: You are going to do that most of the time because you have no option. You can't question your evidence for everything. System 1 happens automatically. As a voter, you can say, "I really want to vote on the issues" or "I want to vote strategically," but that takes a lot of time and effort.
People make judgments from environmental cues, especially people who depend on television and don't read and don't think a lot. In fact, people who go by System 1 are very influenced by faces -- they judge competence just from photos. We are very influenced by completely automatic things that we have no control over, and we don't know we're doing it. Nobody would say, "I'm voting for this guy because he's got the stronger chin," but that, in fact, is partly what happens.
GMJ: But that's silly. Why would we do that?
Dr. Kahneman: It's called "judgment by representativeness," and we do it a lot. It happens when you make a judgment of what's going to happen by extrapolating in the most direct way possible from what you see, and you do this by choosing the outcome that is most familiar. So if a politician looks presidential, and there is such a thing as looking presidential, System 1 says, "That guy would be a good president." When people are influenced by how presidential the candidate looks, they're usually using representativeness. There was something about President Reagan, for example, that was just very impressive in the way he walked. It looked like a decisive walk, and it made an immediate and very strong impression.
We also infer the quality of decision making from the speed at which people make decisions. Anybody who makes decisions quickly has an advantage. I think that, by and large, the population liked the decision-making style of President Bush more than it likes that of President Obama. Bush's style was more intuitive and faster, while Obama's is very reflective; he takes his time. Quite independent of anything else, people make inferences about the decisiveness of an individual based on the speed of that person's decisions. Again, that's System 1, and there's not much you can do about that.
GMJ: So we make positive inferences about people who appear to make snap decisions. But I've also noticed that we are pretty self-congratulatory about our own snap decisions, which always seem so perceptive and wise.
Dr. Kahneman: Strong impressions can feel that way, yes. Those impressions, those judgments, generally come with a fairly high level of confidence. But they can be mistaken. It's a mistake for people to have confidence in a judgment because it made for a good story, when in fact confidence should be based on the quality and quantity of the evidence.
Instead, we feel confident, and that's very different from a judgment of the probability that we're right. That kind of subjective confidence comes from cognitive ease, or how easily an idea or an answer comes to mind. Sometimes the easiest answer is not the correct one. Sometimes the impression is mistaken.
GMJ: Is that part of the "hubris hypothesis" that you mentioned in the book?
Dr. Kahneman: That hypothesis was proposed by a famous professor of finance to explain why so many mergers and acquisitions among large firms fail. The idea is that you look at the other firm, and it seems to be floundering. So you think, "Oh, those managers are inept -- I could do better." That motivates you to buy their company, usually at an inflated price, because you think that you can make that firm perform so much better than it's currently doing.
That's the hubris hypothesis. Quite often, however, management appears to be floundering not because they're inept, but because they face a problem they cannot solve. If that is the case, you'll face the same problem when you acquire the company, and you won't do any better.
If people are failing, they look inept. If people are succeeding, they look strong and good and competent. That's the "halo effect." Your first impression of a thing sets up your subsequent beliefs. If the company looks inept to you, you may assume everything else they do is inept. Then you don't want to change your mind because of the confidence you feel.
GMJ: So what can a person do to avoid hubris?
Dr. Kahneman: A suggestion would be this: whenever you see a firm failing, consider that your judgment of its management is likely to be too severe. And whenever a firm is doing extremely well, you should infer that you are probably overestimating the contribution of its managers, because it's very likely that if a firm is doing well, it's been lucky. That reasoning holds true for firms and for people -- even golfers. If you know that Tiger Woods is doing well, then you can infer that he is talented. But, by the same token, you can also infer that he was favored by luck. In the future, the talent will remain, but the luck will not.
GMJ: There are people who will deny luck even exists. They'd say that smarts and a good plan are all you need.
Dr. Kahneman: That's how we fall prey to the planning fallacy. The planning fallacy is that you make a plan, which is usually a best-case scenario. Then you assume that the outcome will follow your plan, even when you should know better. This is what happens with kitchen renovations. You have a plan, and you have a budget, and you have an idea. You simply do not anticipate problems, which you should, because statistics show that you probably will end up spending twice as much money and time as you have budgeted. But people don't anticipate accidents, and they don't anticipate their own changes. We think that "what we see is all there is."
GMJ: "What we see is all there is" is such a common fallacy that you gave it an acronym in the book: WYSIATI.
Dr. Kahneman: That's just how System 1 works. We sometimes have very strong impressions about complicated problems, and we don't allow for what we don't know. System 1 makes the best story possible of the information available. When there is little information or it's of poor quality, we generate the best story we can anyway, and it's the coherence of that story that determines our confidence. WYSIATI means that you don't allow for what you don't know. System 1 really isn't designed to allow for what you don't know.
GMJ: Can you give me an example of how WYSIATI works in practice?
Dr. Kahneman: Think of a politician, for example. If I tell you that he's intelligent and firm, you already begin to think he's a good leader. You do not allow for the fact that you know very little and that I could tell you next that he is cruel and corrupt. You form a fairly definite impression on the basis of the first two attributes.
An ideal judgment machine would evaluate the quality of the evidence and see what can be inferred from the evidence. But we're not ideal judgment machines. We make judgments on the basis of very little information, from what we see, and we're very confident about them. That's the way System 1 is designed. It's designed to avoid paralysis. You can't wait for information. Evolution hasn't designed us to wait for information; it has designed us to make decisions.
GMJ: Quickly and before a lion eats us.
Dr. Kahneman: Yes, quickly.
GMJ: But there aren't a lot of lions wanting to eat us anymore. So when we're making these decisions, doesn't it occur to us that we might be lacking information?
Dr. Kahneman: No, it typically doesn't. If I tell you I'm going to give you two adjectives about a national leader, and I say they are "intelligent" and "firm," that seems like enough information. System 1 can make a judgment on that basis. But in fact, there is a whole world of other traits that are relevant to leadership; you just don't think about what you don't know. You make a judgment on the basis of what you do know.
GMJ: But once I've formed a judgment, will I accept more information and change my mind? Or will I defend my erroneous judgment?
Dr. Kahneman: It takes a lot of work to change our minds. We're strongly influenced by our first impressions. Intuitive judgments come with high confidence. Reminding yourself that you could be in error doesn't help, because you need confidence in order to act. There is an idea that undermining confidence is a destructive thing because it leaves people paralyzed.
GMJ: But if it prevents you from making a mistake . . .
Dr. Kahneman: Yes. In many cases, paralysis would be better than action.
GMJ: So the minute you think, "I'm a genius!" you probably need to think again, right?
Dr. Kahneman: It doesn't even have to be that you're a genius. But when you get a powerful first impression, stopping to think may be important. Sometimes you should look for a different way of evaluating and judging the evidence -- asking other people for objective opinions, for instance -- but you can't do those things for every question that comes to mind. Most of the time, though, there's nothing we can do; we have to act on our impressions. We can't stop and deliberate all day long.
That's why I wouldn't write a self-help book, because I don't think we can do System 2 all the time. System 2 is too lazy -- it does as little as possible. There really is a "principle of least effort" operating, so we try to get by with our intuition. For some people, thinking is really painful. For anybody, just slowing yourself down is painful, because fluency and cognitive ease are very pleasant. Slowing yourself down means imposing not only additional work but some unpleasantness on yourself.
GMJ: But it can be done.
Dr. Kahneman: Of course. But not quickly or easily.
-- Interviewed by Jennifer Robison
http://gmj.gallup.com/content/153062/Truth-Think.aspx?ref=more