It’s not hard to find a reason to recommend Daniel Kahneman’s book, Thinking, Fast and Slow.

Kahneman’s book helps us understand not only human behavior and decision making, but also journalism and how news stories are reported. His work describes the ways people interpret information incorrectly or with bias, flaws in thinking that can seep into journalists’ reporting as well.

Kahneman is Eugene Higgins Professor of Psychology Emeritus at Princeton University and a professor of public affairs at the Woodrow Wilson School of Public and International Affairs. He received the 2002 Nobel Memorial Prize in Economic Sciences, and Thinking, Fast and Slow is a summation of his work with the late cognitive pioneer Amos Tversky. Kahneman and Tversky’s theories offer a new explanation for the psychology of judgment, decision-making, behavioral economics — in short, how we think.


[Photo credit: Jon Roemer/FSG Books]

The book describes thinking as a combination of two “systems.” System one is the default: fast-moving, instinctive, and quick to jump to conclusions. It’s operating, for instance, when you’re driving a car without being particularly conscious of everything you’re doing. System two is slow but rational; it requires more mental effort and kicks in when you’re solving a difficult problem. The two systems, Kahneman argues, compete and interact as people make decisions.

Dive a bit deeper into the book and you’ll learn about why we cling to narratives to explain events, why assumptions can be more powerful than evidence, and how our minds are susceptible to all sorts of outside influences.

I’m surprised at myself that I agreed to talk. I really don’t do interviews at all. So something must have happened that was especially charming, or I was asleep … Let’s assume it was charm.

The journalist’s day job might be described as trying to explain events and the motivations behind them. What are some of the most common errors you see when you read the news?

DANIEL KAHNEMAN: The obvious one that comes to mind is that the stories are too good. They are oversimplified and exaggerate the coherence of the information. That’s something we know. It comes naturally for journalists to do, but it’s also what the reading public demands.

There’s a fit between what comes naturally for journalists and what people want. On both sides of this, there’s an eagerness to produce stories that are coherent and to hear stories that are coherent. So the stories are simpler than reality and in some ways “better” than the true stories.

How can journalists use your findings about these two modes of thinking not just to tell better stories but also to make news more accurate?

KAHNEMAN: To make news more accurate — what you probably mean is to make people’s understanding of the news more accurate.

There is a strong temptation for journalists to tell people what they want to hear, both in terms of opinion and even in terms of facts. So when you’re talking about accuracy, it means really stressing the facts. And the facts that are going to be most difficult for the public to assimilate are the things they don’t want to hear, or the ideas they don’t want to understand because they don’t fit their conception of the world.


So what we mean by accuracy or conveying accurate information is leaning against people’s biases, which is probably not what comes naturally when you’re a journalist and you have an audience. Your natural tendency is to lean toward the biases of your audience as opposed to leaning against them, but that’s where the issue of accuracy comes in.

Assuming that [journalists] are not telling lies — that they’re just selecting some information and phrasing or describing the information that they do have — this can be done either in favor of the biases of the audience or to lean against them.

You are going to be more accurate, and produce more accuracy, by leaning against the biases, by using words like “surprisingly.” How many sentences begin with “surprisingly”? If you want people to understand, that would be a question I would ask.

If you want people to understand things, then the word “surprisingly” should appear a fair amount, because news is about surprises. The real news is not just events; it’s whatever gives people a more accurate picture of the world.

So that’s what I mean by “leaning against biases”: drawing people’s attention to facts that they normally would not pay attention to, or might not even want to pay attention to, even though they are important.

In the way you see people think about information, what are the safeguards against succumbing to your own blind spots or biases?

KAHNEMAN: There are no safeguards, really, except to distinguish facts from interpretation. What is the factual evidence? What are the overlays of interpretation on top of that evidence? Quite often, what makes sense of a story are the things you don’t even know: you speculate, you fill in. That is the way to make sense of it.

We don’t really know what the Russians are up to, but we try to make sense of it. And in trying to make sense of it, assumptions are made about Putin and his orders … they sound like facts, but in fact, they’re guesses.

They’re guesses because in our own thinking, it is not always easy to distinguish the facts we know from observation from the things we merely guess to be facts.

So the boundaries between guesses and truth, between guesses and facts, are blurry. Showing [journalists] that boundary, showing them that distinction, is the way you safeguard against over-interpretation or biased interpretation.

How do you clarify the boundary between fact and guess? And how do you show that difference to the audience so they can make sense of it?

KAHNEMAN: I think readers will not make the distinction unless they’re specifically told. They will assume that the story is true; that is, they will assume that the interpretation and the guesses are at the same level as the facts. So unless the journalist very explicitly distinguishes between facts and interpretation, or facts and speculation, the public is going to accept speculation as if it were known to the journalist as fact.

That’s the dependency of readers. We are told that Putin did this because of that, but in fact we don’t know; we are speculating. But someone who reads it, especially quickly, says, “Oh yes, I’ve just learned from the paper that Putin is doing this because of that. Those are his considerations.” In fact, that was a guess.

So that’s what’s going to happen on the other side.

In the book you discussed how precise numbers can be irrelevant to the understanding of a story; our brains do not comprehend big numbers well. How can journalists present data, statistics, and facts in ways that encourage more empathy and understanding?

KAHNEMAN: It is so easy to bias people’s interpretation of facts. The best example is an example I cite in the book.

For example, what does exposure to the sun do to the rate of cancer deaths? If you cite the figure as a percentage, it’s a tiny number and it looks negligible. But if you say that 200 people died, that has a completely different impact. So, inevitably, different presentations of the same statistical or numerical data are going to vary the impact.

Journalists should be aware of that. They probably intuitively bias the way they describe a story to get the impact they wish to have. When you want people to be impressed, you say 200 deaths; when you want them not to be impressed, you say the rate increased from .00002% to .000055%. Those can mean the same thing but have a very different impact on people.

And with the news, [there’s] hardly any way to get it right, I think. So, journalists should be aware of that.
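As a side note for readers who want to see how the two framings relate, here is a minimal sketch under an assumed, hypothetical population of 100 million, in which deaths rising from 20 to 55 produce exactly the rates Kahneman cites (the specific population and death counts are illustrative choices, not figures from the interview):

```python
# Hypothetical figures: in a population of 100 million, deaths rising
# from 20 to 55 correspond exactly to the rates quoted above.
population = 100_000_000
deaths_before, deaths_after = 20, 55

# Rate framing: expressed as percentages, the change looks negligible.
rate_before = deaths_before / population * 100  # 0.00002%
rate_after = deaths_after / population * 100    # 0.000055%

# Count framing: the same change reads as a concrete number of deaths.
extra_deaths = deaths_after - deaths_before     # 35

print(f"rate rose from {rate_before:.6f}% to {rate_after:.6f}%")
print(f"{extra_deaths} additional deaths")
```

The underlying data are identical in both printouts; only the presentation changes, which is Kahneman’s point about framing.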

If accurate understanding requires journalists to show humility and transparency, and audiences are prone to accept what they hear, is there much hope that the public will understand events accurately?

KAHNEMAN: No, I mean, that’s the way it is.

When you couple that with the idea that most of the news that people have, they have from media that they select for compatibility with their prior belief, then people are not looking to be surprised. They are looking for news that fits their view of the world. They want news that tells them that villains behave badly and good people behave well. Those are the kinds of things that people are asking for.

Psychologists call this idea dissonance. It’s like Fox News coming out with something really, really nice that Democrats have done, or Rachel Maddow coming out and praising Republicans; that would be jarring with what the public expects. They expect the bias, and they’re used to it. They might not like it if they were surprised.

On the flip side, how can readers be more inclined to notice the assumptions and biases that journalists may present?

KAHNEMAN: It really takes a highly sophisticated reader, when they read very surprising results of a poll, to ask, “Well, what’s the sample size of that poll? How accurate was it?” So that just takes sophistication — especially in a numerical domain.


Detecting bias is something that people will do when they don’t like the bias. I don’t think the listeners of MSNBC and Fox News believe that what they’re listening to is biased; they think it’s fair and balanced, and they tend to accept it. So you tend to be sensitive to bias when you don’t like the message.

And I don’t see what can be done about that. Except raise the general level of sophistication.

And how do you do that, raise the sophistication of audiences?

KAHNEMAN: The odd thing is that educating people doesn’t seem to be enough, because we find that educated people are not less polarized than non-educated people in their politics and their biases.

So education by itself does not reduce polarization. It’s interesting and disheartening.

In this older piece from Michael Lewis in Vanity Fair, he asks how you and Tversky came up with such unusual experiments to study ‘human idiocy or irrationality.’ You said you just observed your own errors. How can we do the same?

KAHNEMAN: In the first place, what Tversky and I were doing, what Michael Lewis is talking about in judgment, was possible because we knew statistics. We were working in a fairly circumscribed domain where we knew what correct thinking is, and we were looking for cases where our intuitions deviated from correct thinking.

Of course, this is completely different from just living your life and recognizing that you’re in a situation where your judgment is wrong and you don’t know the correct answer. What made it easy for us is that we knew the correct answer, and all we had to do was find cases in which our intuition pulled us away from it.

What people can do is they can recognize situations or circumstances under which they are likely to be biased and then they should suspect their own judgment, but that is very difficult to do.


But if you’ve read the book, you know that when you are making predictions, it’s good to remind yourself that, given the information you have and its quality, you should not make extreme predictions, because they’re unlikely to be correct. You should not predict rare events with high confidence.

Those are rules that people can acquire so that when they’re predicting very rare or exceptional events, they can catch themselves and say, “This is what I feel is going to happen, but what I feel is probably wrong.”

And I confess, I don’t do this. My judgment is not better than anybody else’s, and I don’t question it much more often than other people do. I’m better at detecting other people’s mistakes than my own.

I think it’s rare that someone goes through their life asking, ‘What am I biased about today?’ 


KAHNEMAN: Yeah, that’s not going to happen. When you are making important decisions and you want to get it right, you should get the help of your friends. And you should get the help of a friend who doesn’t take you too seriously, since they’re not too impressed by your biases.

During a keynote to the Online News Association conference last summer, Nate Silver said all journalists should read your book to help them better understand the world and human intuition. What’s your reaction?

KAHNEMAN: I’m flattered. I didn’t specifically expect it. To the extent that people think this is a book that has interesting information about how the mind works, it’s certainly not specialized for non-journalists.

I can’t tell [journalists] what to do. What I can do is hear what you think you have learned about human intuition and then I can comment on what you think you have learned.

This is really a tough topic for journalists. Once that topic comes up, and possible implications for journalists are raised, then it becomes possible to have an intelligent conversation about whether they are drawing good conclusions from the information in front of them.

So your theories about our two systems of thinking can only really indicate the process, not how to execute?

KAHNEMAN: I can’t come up with suggestions for what journalists ought to be doing differently without knowing sufficiently well what journalists do. One has to be familiar with the world of journalism and then say, well there is this bit that we might do differently if we took into consideration that people are going to respond intuitively rather than reasonably about the facts.

Nate Silver and Ezra Klein have both launched projects focused on explanatory and data-driven journalism. In a world with more data available, should we expect more of this type of reporting?

KAHNEMAN: I don’t know enough about that world to comment about it. It’s an interesting question obviously but it could also be isolated.

We are very impressed by Nate Silver because he predicted all 50 states, and every reader of Ezra Klein was … I was sorry to see him disappear from The Washington Post. But I’m not sure that there are so many others. There are a few, and we talk about them a lot because they are oddities.

We may be viewing a trend where, in fact, there are only a few data points. I need more evidence to believe that there’s a general trend toward broad fact-based journalism. It’s possible, but I’m not sure.

And clearly Nate Silver’s success in the last election is going to bring along a few others. He made punditry look ridiculous. That’s a big achievement, and I suppose the pundits are shameless, but ultimately it is embarrassing to them, and I think they’ll be much more restrained in the next election.

It’s going to be extremely interesting to see the educational effect of Nate Silver’s success on the commentary before the next election. I suspect it could really change.

How do you get your news? Any favorite writers or publications?

KAHNEMAN: Of course. I get the news that fits my biases, more or less. I like The New York Times, and I like The Washington Post. I’m a consumer of news. I hardly watch television at all, so I “watch” my news in print.

Do you think there’s an advantage to that?

KAHNEMAN: It’s just my aversion to ads, and [TV] is very inefficient.

I find it more efficient to just scan the paper and pick interesting stories. When you’re watching television, you’re captive: they decide for you what you should be interested in, and for how long you should remain interested. You’re much more active as a consumer of the printed word, or of the internet for that matter, because you can pick.

Just any medium where you can control and select what you’re exposed to has that advantage.
