
Back in 1996, my husband and I had a heated debate over then Federal Reserve Board Chairman Alan Greenspan’s use of the term “irrational exuberance” to explain the bull stock market at that time. He agreed with Greenspan that irrationality was the only explanation for some of the ridiculously inflated price-to-earnings ratios, while I cited social psychological research demonstrating that there is an underlying rationality to seemingly irrational beliefs.

I have been thinking about our debate on “irrational exuberance” as I reflect on the January 6th attack on the Capitol. It would be easy to accept that those who stormed up the steps were somehow uniquely susceptible to conspiracy theories or fake news; that they were driven by their own form of irrational exuberance. But the arguments I made in 1996 against the notion of human irrationality remain valid. The same cognitive processes that shape the worldview of conspiracy theorists are at work inside us. They are part of our fallen human nature, and the differences are ones of degree, not kind.

Rather than seeing people as irrational, we should think of human nature as marked by “bounded rationality,”1 a penchant for quick, intuitive thinking. What looks like a dysfunction is in fact highly functional, making us efficient and effective in our reasoning. Our brains process incoming perceptions so quickly that we can pull a small child back to the curb in the face of unexpected traffic without having to think twice about it. And it’s not just in high-stakes situations that our brains make complex, split-second decisions in infinitesimally small increments of time. Our day is full of quick cognitive shortcuts, or “heuristics,” that free up our brains for more important, strategic, and analytical reasoning. Daniel Kahneman, in his book Thinking, Fast and Slow, defines a heuristic as “a simple procedure that helps find adequate, though imperfect answers to difficult questions.”2 Think about the ease of eating at a local restaurant. You know the rules so well that you can go through the steps of being seated, ordering, eating, and paying the bill with little thought. But go out to eat in a foreign country and another layer of cognitive effort is added as you try to enjoy the meal. You just don’t know the local rules of the game.

The wonders of this cognitive efficiency are only possible through the use of heuristics. But they come with a cost. The heuristics that speed along hundreds of decisions in an ordinary day also shape the not-so-benign cognitive shortcuts we know as “biases.” Psychologists have identified over a hundred of these biases that infiltrate our beliefs, our sense of self, and our sense of belonging. Here are a few well-known biases that, if left unchecked by thoughtful reflection, leave us gullible to false beliefs:3

    • Confirmation bias: When we want to believe something or someone, we are motivated to find ways to confirm our belief, but when we don’t want to believe something, we easily discount it, even with the flimsiest of evidence. This leads to . . .

    • Selective scrutiny: We don’t deeply analyze conclusions that align with what we already believe but are much more critical in thinking through evidence with which we disagree.

    • Proportionality bias: We tend to believe that big events require a big explanation. It’s difficult to believe that two million people around the globe could have died from the interaction between a human and a bat. This disproportionality makes it easier for some people to believe in global conspiracies or hoaxes.

    • Repetition effect: We tend to remember information that is repeated often and prefer familiar notions to novel ones.

    • Self-enhancement bias: Because we are naturally motivated to increase our feelings of self-worth, we tend to describe ourselves in positive terms that are usually higher than warranted. We seek out and join groups that make us feel wanted, special, and good about ourselves.

    • Ingroup-outgroup bias: We tend to be more favorable and forgiving toward those with whom we identify, since such favorability also reflects well on us. We tend to emphasize negative aspects of an outgroup in order to strengthen our sense of superiority and are more likely to use stereotypes to describe the beliefs and motives of outsiders.

Belief systems are shaped and reinforced by unexamined biases, but as the research on ingroup-outgroup biases demonstrates, they are not the product of thought processes alone: they are formed throughout early development and reinforced by social networks. Shared biases provide consistency, trust, and greater bonds with like-minded people. However, these social benefits make biases nearly impossible to extinguish when people are drawn into tightly knit and closed-off social communities where biases become self-perpetuating and self-reinforcing. The philosopher C. Thi Nguyen describes such communities in terms of “epistemic bubbles,” where relevant voices may be intentionally or inadvertently left out, and “echo chambers,” where other relevant voices are actively excluded and discredited.4 Echo chambers are especially problematic as they can foster group polarization, in which the beliefs and opinions of the community shift further to the extremes as people have their beliefs reinforced by others and are exposed to new beliefs held by more radical members of the group.

Many in politics, the church, and other civic institutions are calling for unity and reconciliation. Unfortunately, the social psychological research on biases is so robust that the authors cited in this piece, as well as many others in the field, are not optimistic that people can be motivated to change their beliefs and concomitant behaviors. And this pessimism about the possibility of change is especially true with regard to those who are deeply ensconced in “echo chambers.”

But the situation is not completely hopeless. Those of us who want to work for unity must commit to the hard work of analyzing how our biases have shaped us. My favorite description of reconciliation comes from Cambridge Professor Philip Sheldrake, who wrote that reconciliation does not mean forgetting but “re-remembering in a new way, in a new context where we learn how to remember together rather than continue to trade memories in the same way that we trade blows.”5 In this case, reconciliation is not just about finding common ground but interrogating our biases to “re-remember in a new way.”

Yet questioning our own beliefs is difficult since biases also create blind spots. We become more successful in dealing with biased beliefs when our “us” becomes more expansive. Unity, by definition, requires broadening our social networks. This is where social psychologist Jonathan Haidt finds hope, writing, “if you put individuals together in the right way, such that some individuals can use their reasoning powers to disconfirm the claims of others, and all individuals feel some common bond or shared fate that allows them to interact civilly, you can create a group that ends up producing good reasoning as an emergent property of the social system.”6 This past summer, many saw the power of “us” to think differently about racism after deeply honest conversations with brothers and sisters in Christ who do not share our skin color. Empathy can burst our epistemic bubbles.

The words of President Lincoln in his 1862 annual message to Congress capture the work that is before us:

The dogmas of the quiet past are inadequate to the stormy present. The occasion is piled high with difficulty, and we must rise with the occasion. As our case is new, so we must think anew, and act anew. We must disenthrall ourselves, and then we shall save our country.

It would be easier to stay on the sideline of contempt if I saw those who stormed the Capitol on January 6th as merely irrational. But I must own that my bounded rationality—that which can grab a child away from an oncoming car—can also devolve into issuing death threats to political leaders. In this national moment, each of us must reckon with how our biases and existing social networks have shaped our thoughts, actions, and inactions if we are to be serious about engaging in reconciliation and unity.

We have met the enemy and he is us.

Footnotes

  1. Daniel Kahneman, “Maps of Bounded Rationality: Psychology for Behavioral Economics,” The American Economic Review 93, no. 5 (2003): 1449-1475.
  2. Daniel Kahneman, Thinking, Fast and Slow (New York: Farrar, Straus and Giroux, 2011), 98.
  3. For a fascinating read on how biases can shape false beliefs, see Joseph P. Forgas and Roy F. Baumeister, The Social Psychology of Gullibility: Fake News, Conspiracy Theories, and Irrational Beliefs (New York: Routledge, 2020).
  4. C. Thi Nguyen, “Echo Chambers and Epistemic Bubbles,” Episteme 17, no. 2 (2020): 141-161.
  5. Philip Sheldrake, “A Spirituality of Reconciliation,” http://digitalobby.spu.edu/csfd/wp-content/uploads/sites/60/2018/02/sheldrake_reconciliation.pdf.
  6. Jonathan Haidt, The Righteous Mind: Why Good People Are Divided by Politics and Religion, 1st Vintage Books ed. (New York: Vintage Books, 2014), 105.

Margaret Diddams

Dr. Diddams is an Industrial / Organizational Psychologist and Editor of Christian Scholar's Review.
