Here is an excerpt from an article written by Cheryl Strauss Einhorn for Harvard Business Review and the HBR Blog Network. Please click here to read the complete article, check out the wealth of free resources, obtain subscription information, and sign up for HBR email alerts.
Credit: Johannes Schwaerzler/EyeEm/Getty Images
* * *
As we’re battling a virus that scientists still don’t fully understand, watching the stock market sink, then soar, then sink again, and facing a contentious election, the future seems completely unpredictable (instead of merely as unpredictable as it has always been). When we feel such heightened uncertainty, our decision-making processes can break down. We may become paralyzed and afraid to act, or we may act on the basis of bias, emotion, and intuition instead of logic and facts.
Being aware of our uncertainty is a necessary precursor to managing it. Effective awareness means pausing, taking a strategic stop, and assessing the situation and the unknowns. We’re now being confronted with data that looks actionable — even though logically, we know it’s incomplete and volatile. But even when knowledge is limited, we have tools to help us make decisions systematically and analytically. Whether we’re assessing the meaning of the latest unemployment numbers or the impact of local romaine lettuce shortages, we can use a simple four-step process to work with and through ambiguity to make careful, reasoned decisions.
[Here are the first two steps.]
1. Identify the category of historical data you are working with.
There are three main kinds of data we often confront and feel compelled to act on: salient data, which captures our attention because it is noteworthy or surprising; contextual data, which has a frame that may impact how we interpret it; and patterned data, which appears to have a regular, intelligible, and meaningful form.
2. Recognize which cognitive biases are triggered by each category.
Different kinds of data trigger different biases, so identifying the data type and its related bias makes it easier to escape mental mistakes.
- Salient data can activate salience bias, in which we overweight new or noteworthy information, resulting in suboptimal decision-making, planning errors, and more. For example, airline passenger demand in April 2020 plunged 94.3% compared with April 2019, because of Covid-19-related travel restrictions. That shocking statistic might make us think that travel as we have come to know it is finished — but in reality, this one salient piece of data tells us almost nothing about future travel.
- Contextual data can constrict our thinking and lead to a framing bias: The context in which we receive the data impacts how we think about it. For example, “80% lean ground beef” sounds more healthful than “beef with 20% fat.” But it’s the same beef, framed differently.
- Patterned data often prompts the clustering illusion — also known in sports and gambling as the “hot hand fallacy” — whereby we assume that random events are information that will help us predict a future event. The human brain is wired to look for patterns, sometimes when they don’t exist. Equally important, when patterns do exist, they often don’t have predictive value. A die that turns up a two several times in a row has established a pattern, but that says nothing about what the next roll will be.
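The dice claim above is easy to check empirically. The short simulation below (an illustrative sketch, not part of the original article) rolls a fair die many times, finds every run of three consecutive twos, and tallies what comes next. If the "pattern" had predictive value, twos would be overrepresented on the following roll; instead their share stays near the baseline 1/6.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

# Simulate a long sequence of fair-die rolls.
rolls = [random.randint(1, 6) for _ in range(1_000_000)]

# Collect the roll that immediately follows each run of three twos.
followers = [rolls[i + 3] for i in range(len(rolls) - 3)
             if rolls[i] == rolls[i + 1] == rolls[i + 2] == 2]

# With a fair die, the next roll is still uniform: each face ~1/6,
# regardless of the streak that preceded it.
share_of_twos = followers.count(2) / len(followers)
print(f"runs of three twos observed: {len(followers)}")
print(f"share of twos on the next roll: {share_of_twos:.3f}")
```

The share printed on the last line hovers around 0.167, which is just the unconditional probability of rolling a two: the streak tells us nothing.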
Recognizing how each of these categories triggers our biases can prevent us from falling prey to those biases, but how do we move forward once we’ve accepted that we need additional information or insight to confidently make decisions about the future?
* * *