Decision Traps: How a Decision-Making Framework Helps Overcome our Mind’s Hidden Shortcuts
In my last article, I shared three key benefits of implementing a common Decision-Making Framework in your organization. I also laid out a few criteria that should be considered when developing your Decision-Making Framework:
- It should be scalable, meaning you can apply the same Decision-Making Framework to make decisions at the front line of the organization, as well as for complex decisions in the C-Suite or Board Room
- It must address the common decision traps caused by our inherent cognitive bias
Before sharing the Decision-Making Framework that I have used and taught to my teams, and that you can deploy across your organization, I want to delve a little deeper into what decision traps are, how they lead to bad decisions, and how adopting a Decision-Making Framework can help mitigate their impact.
In the 1970s, psychologists Amos Tversky and Daniel Kahneman wrote several groundbreaking papers credited with founding the modern-day discipline of behavioral economics, for which Kahneman was awarded the Nobel Memorial Prize in Economic Sciences in 2002. Many popular books like Freakonomics and Nudge have their basis in the concepts introduced by Tversky and Kahneman. In one of their most famous papers, Judgment Under Uncertainty: Heuristics and Biases, they describe the problem of cognitive bias in decision-making as follows:
People rely on a limited number of heuristic principles, which reduce the complex tasks of assessing probabilities and predicting values to simpler judgmental operations. These heuristics are generally quite useful, but sometimes they lead to severe and systematic errors.
To say that in simpler terms, we face thousands upon thousands of decisions daily. What to wear, what to eat, when to leave, who to sit next to, etc. To prevent our brains from exploding, our minds create "mental shortcuts" that make decision-making easier. While these mental shortcuts are incredibly helpful (we would be paralyzed with indecision if we didn't use them) and generally very efficient, they sometimes lead to catastrophic errors.
For instance, consider the following example:
A neighbor has described an individual as follows: “Steve is very shy and withdrawn, invariably helpful but with little interest in people or in the world of reality. A meek and tidy soul, he has a need for order and structure and a passion for detail.” Is Steve more likely to be a librarian or a farmer?
If you are like most people, your initial intuition is that Steve is more likely to be a librarian. Would you reconsider your answer if I told you there are roughly 200 farms for every library in the U.S.? Which fact is more relevant in making your estimate? That there are 200 times more farms than libraries, or that Steve is described as "a meek and tidy soul"? Clearly, it should be the ratio of farms to libraries, but did you even think about that before being provided the data?
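To see just how much the base rate matters, here is a quick back-of-the-envelope Bayes calculation. The 200:1 farms-to-libraries ratio comes from the example above; the "fit" percentages for the description are purely illustrative assumptions, chosen to be generous to the librarian stereotype:

```python
# Back-of-the-envelope Bayes calculation for the "Steve" example.
# The 200:1 base rate is from the article; the description-fit
# probabilities below are illustrative assumptions, not real data.

farms_per_library = 200
p_librarian_prior = 1 / (1 + farms_per_library)        # ~0.5%
p_farmer_prior = farms_per_library / (1 + farms_per_library)

# Generously assume the "meek and tidy" description fits
# 90% of librarians but only 10% of farmers.
p_desc_given_librarian = 0.90
p_desc_given_farmer = 0.10

# Bayes' rule: P(librarian | description)
numerator = p_desc_given_librarian * p_librarian_prior
denominator = numerator + p_desc_given_farmer * p_farmer_prior
p_librarian_posterior = numerator / denominator

print(f"{p_librarian_posterior:.1%}")  # about 4% -- the base rate dominates
```

Even when the description fits librarians nine times better than farmers, Steve is still far more likely to be a farmer, simply because there are so many more farmers.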
So why isn’t it obvious to so many of us to ask about the ratio of farms to libraries? The answer is that our brains just aren’t wired that way. Most of us aren’t very good at mental math, and even if we were mental gymnasts, we wouldn’t have time to do statistical calculations for every decision we make. So, our brains have created simple rules to make decisions faster. In this case, we likely have a “stereotypical” image of a librarian in our mind (whether it is accurate or not is another discussion). Since the description of Steve fit it, we defaulted to saying Steve is more likely to be a librarian.
While these mental models are essential and work well in most cases, they can lead to catastrophic results in others. For instance, when evaluating an investment that appears similar to other investments we remember working out well, we may not ask for, or even consider, all the data truly required to evaluate whether it is a good idea.
The value of having a defined, common Decision-Making Framework is that if structured properly, it can help us identify our blind spots and minimize the likelihood that we get caught in a decision trap. It forces us to slow down and ask specific questions that help surface the information we haven’t thought of.
As I introduce the Decision-Making Framework I use in subsequent articles, I will point out some common decision traps that result from our mind’s hidden shortcuts and how the Decision-Making Framework is structured to help minimize them. If you want to make sure that article shows up in your LinkedIn feed, follow me and like this article.
In the meantime, if you want to learn more about cognitive bias, I highly recommend reading Thinking, Fast and Slow by Daniel Kahneman.
Thank you for investing your time in this read.
Seeking to empower your team to make better decisions? Schedule a consultation with Chris. There’s no cost.
#enablingempowerment #decisionmaking #cognitivebias #empoweremployees #decisiontraps
Chris Seifert is an operations leader with 25+ years of experience managing high-risk, complex manufacturing operations and advising senior executives on strategy, leadership, culture, and execution. Most recently, Chris led Enviva Biomass's manufacturing operations, first as VP HSEQ and then as VP Operations, during a 6-year period in which revenue grew from $450MM to >$1B and plant production increased by >200% through commissioning new assets, integrating acquisitions, and organic growth, while safety incident rates fell by more than 85% and adjusted EBITDA grew by >250%. As a Partner at Wilson Perumal and Company, Chris founded and grew an Operational Excellence Consulting Practice and became recognized internationally as a leading expert on Operational Excellence (OE), Operational Discipline (OD), and Operational Excellence Management Systems (OEMS). Chris has also served as a Plant Manager for Georgia Pacific and Owens-Corning and served in the US Navy Nuclear Submarine Force as a Supply Officer.