Why We Believe
Mitt Romney has a problem. In his quest for the Presidency, no one is quite sure what he believes. In politics, that’s a no-no. A leader without strong beliefs doesn’t inspire much confidence.
However, his opponents seem to run into a bigger problem. They are sure of their beliefs, but often don’t think them through, leading to embarrassing gaffes. One by one, they have fallen by the wayside.
I, for one, appreciate the political theater. There’s nothing more entertaining than watching a stuffed shirt deflated on national television. However, the 2012 Presidential race is about more than shits and giggles, and the rhetoric to this point opens up a very important question: Where do we draw the line between belief and thought?
Probably no one has thought more deeply about the relationship between belief and thought than Daniel Kahneman, who pioneered behavioral economics. In his new book, Thinking, Fast and Slow, he outlines two systems that drive our decisions.
System 1: This is our more instinctual, automatic system. It works very quickly, often without our knowing it. When we “jump to conclusions,” we’re using system 1.
Incidentally, system 1 is very active in politics. Studies by political scientists have found that voters often make judgments of candidates based on “rapid, unreflective inferences” about their appearance.
System 2: This system reflects our more rational side. We use it when we stop, think and weigh facts. That takes a bit more time and effort.
In other words, system 1 contains beliefs (as well as other biases), and system 2 drives thinking. It’s no wonder we favor the fast and frugal system 1 over the slow and difficult system 2. System 1 allows us to quickly make a decision and get on with it, while system 2 immobilizes us.
So it’s easy to see why we look for beliefs in our leaders. We want leaders to act, not sit around thinking. However, that raises many questions: Where do beliefs come from? How do we know when we can trust them and when they are leading us astray? When should we believe and when should we think?
Spend some time in another culture and you will realize very quickly that your new friends hold very different beliefs than you do and work from very different assumptions. Not only do they reach different conclusions, they ask different questions. Quite often, you will find that you will be expected to accept certain practices without asking why.
Nobel laureate Thomas Schelling calls this “tacit bargaining”: an agreement reached without explicit negotiation, and it is far more common than you would think. We usually don’t notice it because it arises from local conventions and just seems like the right way to do things. It’s only when we encounter people who hold different beliefs that we even think to question it.
Many would say that this is a purely cultural phenomenon, but evidence suggests that even our most profound beliefs, such as religious and political opinions, have a large genetic component. Studies of identical twins reared apart (pdf) show medium to strong correlations for religious and political opinions, but very weak links in fraternal twins who were separated.
If beliefs, held by politicians or anybody else, are largely a product of accidents of time, place and genetic make-up, how much trust should we put in them?
Benefits of Experience
Of course, beliefs aren’t entirely accidental. Many of our gut feelings we get from experience and we should listen to them. One example of how acting on belief can be crucial is the story of a fireman in Gary Klein’s widely cited book, Sources of Power.
Prominent in Klein’s research was a lieutenant who insisted he had a “6th sense” about fires. He related how, while fighting a kitchen fire, he once got a bad feeling. He ordered his men out, and a few seconds later the floor where they had been standing collapsed. If not for his quick action, he and his men would have been seriously injured, or even killed.
However, upon further examination there were no paranormal abilities involved. The fireman had recognized that the fire was too hot, too quiet and wasn’t going out the way an ordinary kitchen fire would. He didn’t know exactly what was wrong, but his experience told him that this was no ordinary kitchen fire. And it wasn’t: the fire was in the basement, coming up through the kitchen floor.
And that illustrates another source of belief – encapsulated experience. Our system 1 allows us to draw on lessons learned without engaging the rational part of our brain. It also explains why we expect our leaders to have strong beliefs. How can someone lead effectively if they haven’t experienced enough to know in their heart how they think about things?
Chunks of Belief
The fireman’s story illustrates another important aspect of how we act on beliefs. The lieutenant thought he had a “6th sense” specifically about fires, but nothing else. In a sense, he did. He was able to register a number of factors without thinking. Instead of consciously weighing facts, he got a “bad feeling” and acted on it.
Psychologists call that chunking, and it’s often present in people with a high degree of expertise in a specific area. Chess masters, for instance, can remember the location of every piece on a chessboard from a game played months before, yet do no better on standard memory tests than anyone else.
Our brains are massively parallel computers in which each neuron is connected to thousands of others. We build those connections, called synapses, over time. As we gather more experiences, synapses multiply and strengthen, wiring our brains to weave disparate pieces of information into familiar groupings that we can act on efficiently.
That’s why we are amazingly good at recognizing patterns, but very slow at calculating and deliberating. We favor system 1 because it’s far more developed than system 2. It’s been evolving much longer and we’re a whole lot better at it.
Thinking and Believing
Beliefs, for all their charms, do have an Achilles heel. They serve us well when dealing with familiar situations, but can lead us astray when confronted with the unknown. Firemen, chess masters and poker players can act instinctively in domains where they are expert, but can be confounded by doing their tax returns or choosing a pension plan.
And that’s the funny thing about beliefs. They do not thrive in the absence of thought, but become sound only when we have thought enough about matters to internalize fact patterns. Much like we practice our multiplication tables until they become automatic, thinking deeply about the world around us builds sound beliefs that we can act on.
System 1, in other words, only becomes reliable through the use of system 2. That’s why it’s important for leaders (and the rest of us) to have strong beliefs. However, we should be wary of those who came by them cheaply, who value the destination but not the journey. Those beliefs are dangerous. Socrates himself warned that the unexamined life is not worth living.
Nevertheless, our beliefs, in a very real sense, make us who we are. They represent the sum total of our experiences and the basis for our future actions. We should choose them wisely.