With his book Thinking, Fast and Slow, Daniel Kahneman emerged as one of the most intriguing voices on the complexity of human thought and behavior. He is a psychologist who won the Nobel Prize in economics for helping to create the field of behavioral economics. He is a self-described “constant worrier.” And it’s fun, helpful, and more than a little unnerving to apply his insights into why we think and act the way we do in this moment of social and political tumult.
On taking thought
Before a packed house, Kahneman began with the distinction between what he calls mental “System 1” (fast thinking, intuition) and “System 2” (slow thinking, careful consideration and calculation).
System 1 operates on the illusory principle: What you see is all there is.
System 2 studies the larger context.
System 1 works fast (hence its value) but it is unaware of its own process.
Conclusions come to you without any awareness of how they were arrived at.
System 2 processes are self-aware, but they are lazy and would prefer to defer to the quick convenience of System 1.
“Fast thinking,” he said, “is something that happens to you. Slow thinking is something you do.”
System 2 is effortful, and the self-control it requires can be depleted by fatigue.
Research has shown that when you are tired it is much harder to perform a task such as keeping seven digits in mind while solving a mental puzzle, and you are more impulsive (I’ll have some chocolate cake!).
You are readier to default to System 1.
“The world in System 1 is a lot simpler than the real world,” Kahneman said, because it craves coherence and builds simplistic stories.
“If you don’t like Obama’s politics, you think he has big ears.”
System 1 is blind to statistics and focuses on the particular rather than the general: “People are more afraid of dying in a terrorist incident than they are of dying.”
When faced with a hard question such as, “Should I hire this person?” we convert it to an easier question: “Do I like this person?”
(System 1 is good at predicting likeability.)
The suggested answer pops up, we endorse it, and believe it.
And we wind up with someone affable and wrong for the job.
The needed trick is knowing when to distrust the easy first answer and bear down on serious research and thought.
Organizations can manage that trick by requiring certain protocols and checklists that invoke System 2 analysis.
Individual professionals (athletes, firefighters, pilots) often use training to make their System 1 intuition expert enough to act swiftly on a wider range of signals and options than amateurs can handle.
It is a case of System 2 training System 1 to act in restricted circumstances with System 2 thoroughness at System 1 speed.
It takes years to do well.
Technology can help, the way a heads-up display makes it possible for pilots to notice what is most important for them to act on even in an emergency.
The Web can help, Kahneman suggested in answer to a question from the audience, because it makes research so easy.
“Looking things up exposes you to alternatives.
This is a profound change.”