The conventional wisdom in classical economics is that we humans are “rational actors” who, by our nature, make decisions and behave in ways that maximize advantage and utility and minimize risk and cost. This theory has driven economic policy for generations despite daily anecdotal evidence, in how we invest and what we buy, that we are anything but rational. Economists who embrace this assumption seem to live by the maxim, “If the facts don’t fit the theory, throw out the facts,” attributed, ironically enough, to Albert Einstein.

Dr. Daniel Kahneman and Amos Tversky discovered cognitive biases common in humans

But any notion that we are, in fact, rational actors was blown out of the water by Dr. Daniel Kahneman, winner of the 2002 Nobel Prize in economics, and his late colleague Amos Tversky. Their groundbreaking, if rather intuitive, findings on cognitive biases have demonstrated quite unequivocally that humans make decisions and act in ways that are anything but rational.

Cognitive biases can be characterized as the tendency to make decisions and take action based on limited acquisition and/or processing of information or on self-interest, overconfidence, or attachment to past experience.

Cognitive biases can result in perceptual blindness or distortion (seeing things that aren’t really there), illogical interpretation (being nonsensical), inaccurate judgments (being just plain wrong), irrationality (being out of touch with reality), and bad decisions (being dumb). The outcomes of decisions that are influenced by cognitive biases can range from the mundane to the lasting to the catastrophic, for example, buying an unflattering outfit, getting married to the wrong person, and going to war, respectively.

Information biases and ego biases

Cognitive biases can be broadly placed in two categories. Information biases include the use of heuristics, or information-processing shortcuts, that produce fast and efficient, though not necessarily accurate, decisions, as well as the failure to pay attention to or adequately think through relevant information.

Ego biases include emotional motivations, such as fear, anger, or worry, and social influences such as peer pressure, the desire for acceptance, and doubt that other people can be wrong.

When cognitive biases influence individuals, real problems can arise. But when they influence a business, the problems can be exponentially worse; just think of the Edsel and the Microsoft Kin. Cognitive biases are bad for business, above all because they cause business people to make bad decisions.

In my corporate consulting work, where I help companies make good decisions, I have identified 12 cognitive biases that appear to be most harmful to decision making in the business world. Some of these biases were identified and empirically validated by Kahneman and Tversky. Others I have identified myself and found to pass the “duck” test (if it looks like a duck and sounds like a duck, it’s probably a duck).

Information biases include:

  • Knee-jerk bias: Make fast and intuitive decisions when slow and deliberate decisions are necessary.
  • Occam’s razor bias: Assume the most obvious decision is the best decision.
  • Silo effect: Use too narrow an approach in making a decision.
  • Confirmation bias: Focus on information that affirms your beliefs and assumptions.
  • Inertia bias: Think, feel, and act in ways that are familiar, comfortable, predictable, and controllable.
  • Myopia bias: See and interpret the world through the narrow lens of your own experiences, baggage, beliefs, and assumptions.

Ego biases include:

  • Shock-and-awe bias: Believe that our intellectual firepower alone is enough to make complex decisions.
  • Overconfidence effect: Place excessive confidence in our beliefs, knowledge, and abilities.
  • Optimism bias: Overestimate favorable outcomes and underestimate unfavorable outcomes.
  • Homecoming queen/king bias: Act in ways that will increase our acceptance, liking, and popularity.
  • Force field bias: Think, feel, and act in ways that reduce a perceived threat, anxiety, or fear.
  • Planning fallacy: Underestimate the time and costs needed to complete a task.

Think about the bad decisions that you and your company have made over the years, both minor and catastrophic, and you will probably see the fingerprints of some of these cognitive biases all over the dead bodies.

You Can Fight Back

The good news is that there are four steps you can take to mitigate cognitive biases in your individual decision making and in the decisions that are made in your company.

  1. Awareness is a key to reducing the influence of cognitive biases on decision making. Simply knowing that cognitive biases exist and can distort your thinking will help lessen their impact. Learn as much as you can about cognitive biases and recognize them in yourself.
  2. Collaboration may be the most effective tool for mitigating cognitive biases. Quite simply, it is easier to see biases in others than in yourself. When you are in decision-making meetings, have your cognitive-bias radar turned on and look for them in your colleagues.
  3. Inquiry is fundamental to challenging the perceptions, judgments, and conclusions that can be marred by cognitive biases. Using your understanding of cognitive biases, ask yourself and others the right questions, ones that shed light on the presence of biases and on the best decisions for avoiding their trap.

  4. Process is the final safeguard. Though brainstorming and free-wheeling discussions can be valuable in generating decision options, they can also provide the miasma in which cognitive biases float freely and contaminate the resulting decisions. When you establish a disciplined and consistent framework and process for making decisions, you increase your chances of catching cognitive biases before they hijack your decision making.


Three Key Questions

Daniel Kahneman recommends that you ask three questions to minimize the impact of cognitive biases in your decision making:

  1. Is there any reason to suspect the people making the recommendation of biases based on self-interest, overconfidence, or attachment to past experiences? Realistically speaking, it is almost impossible for these three not to influence people’s decisions.
  2. Have the people making the recommendation fallen in love with it? Again, this is almost inevitable because, in most cases, people wouldn’t make the recommendation unless they loved it.
  3. Was there groupthink, or were there dissenting opinions within the decision-making team? This risk can be mitigated before the decision-making process begins by assembling a team of people who will proactively offer opposing viewpoints and challenge the conventional wisdom of the group.


In answering each of these questions, look closely at how these biases may be woven into the recommendation that has been offered and separate them from its actual value. If a recommendation doesn’t stand up to scrutiny on its own merits, free of cognitive bias, it should be discarded.

Only by filtering out the cognitive biases that are sure to arise while decisions are being made can you be confident that, at the end of the day, you have made the best decision for you and your company based on the best available information.

Here’s a simple inventory I developed to help you identify which cognitive biases you are most vulnerable to:



Author

The author of this page is Dr. Jim Taylor.

Dr. Jim Taylor is an internationally recognized authority on the psychology of performance in business, sport, and parenting.

Dr. Taylor has been a consultant to, and has provided individual and group training for, businesses, executives, sports teams, and medical facilities throughout the world.

Dr. Taylor received his Bachelor’s degree from Middlebury College and earned his Master’s degree and Ph.D. in Psychology from the University of Colorado. He is a former associate professor in the School of Psychology at Nova University in Ft. Lauderdale and a former clinical associate professor in the Sport & Performance Psychology graduate program at the University of Denver. He is currently an adjunct faculty member at the University of San Francisco and the Wright Institute in Berkeley.

Dr. Taylor’s professional areas of interest include corporate performance, sport psychology, coaches education, child development and parenting, injury rehabilitation, popular culture, public education reform, and the psychology of technology.

He has published more than 700 articles in scholarly and popular publications, and has given more than 800 workshops and presentations throughout North and South America, Europe, and the Middle East. Dr. Taylor is the author of 14 books.

Dr. Taylor blogs on business, technology, sports, parenting, education, politics, and popular culture on this web site, as well as on huffingtonpost.com, psychologytoday.com, seattlepi.com, and the Hearst Interactive Media Connecticut Group web sites. His posts are aggregated by dozens of web sites worldwide and have been read by millions of people. A full biography is found on Dr. Taylor's website.