Develop a “Probabilistic” Approach to Managing Uncertainty

Here is an excerpt from an article written by Mike Walsh for Harvard Business Review and the HBR Blog Network. To read the complete article, as well as check out the wealth of free resources, obtain subscription information, and sign up for HBR email alerts, please click here.

Credit:  bmcent1/Getty Images

* * *

Our new world of sensors, smartphones, and connected devices means more data than ever — but does it also mean that it’s getting easier to make well-informed decisions? Quite the contrary, in fact. What’s more important than how much data you have is how it frames the way you think. Too often, leaders under pressure to appear decisive attempt to deal with complex issues with simple rules or analogies, selectively using data to justify poor judgment calls. But what if rather than trying to be right, you could be less wrong over time?

When faced with uncertainty, how should leaders react? Should they make a big bet, hedge their position, or just wait and see? Investors and traders might be adept at managing risk and unforeseen events, but in other industries, leaders can be blindsided by the unknown. We naturally tend to see situations in one of two ways: either events are certain and can therefore be managed by planning, processes, and reliable budgets; or they are uncertain, and we cannot manage them well at all. Fortunately, there is another approach.

Consider Thomas Bayes, an English statistician and clergyman whose theorem, published posthumously in 1763, would forever change the way we think about making decisions in ambiguous conditions. Bayes was interested in how our beliefs about the world should evolve as we accumulate new but unproven evidence. Specifically, he wondered how he could predict the probability of a future event if he only knew how many times it had occurred, or not, in the past. To answer that, he constructed a thought experiment.

Imagine a billiard table. You put on a blindfold and your assistant rolls a ball across the table at random, noting where it stops. Your job is to figure out where the ball is. All you can really do at this point is make a random guess. Now imagine that you ask your assistant to roll some more balls across the table and tell you whether each one stops to the left or right of the first ball. If all the balls stop to the right, what can you say about the position of the first ball? If more balls are rolled, how does this improve your knowledge of the position of the first ball? In fact, roll after roll, you should be able to narrow down the area in which the first ball probably lies. Bayes figured out that even when it comes to uncertain outcomes, we can update our knowledge by incorporating new, relevant information as it becomes available.
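To make that updating concrete, here is a minimal Python sketch of the billiard-table experiment. The 1,001-point grid, the five "stopped to the right" reports, and the update helper are illustrative assumptions added for this post, not details from Walsh's article.

```python
import numpy as np

# Scale the table to the interval [0, 1] and track a belief (a probability
# distribution) over where the first ball might be.
grid = np.linspace(0, 1, 1001)       # candidate positions for the first ball
belief = np.ones_like(grid)          # flat prior: every position equally plausible
belief /= belief.sum()

def update(belief, stopped_right):
    """Revise the belief after the assistant reports one more ball."""
    # If the first ball sits at position p, a randomly rolled ball stops to
    # its right with probability (1 - p) and to its left with probability p.
    likelihood = (1 - grid) if stopped_right else grid
    belief = belief * likelihood
    return belief / belief.sum()     # renormalise so the belief sums to 1

# Suppose the next five balls all stop to the right of the first one.
for _ in range(5):
    belief = update(belief, stopped_right=True)

# The belief now concentrates near the left edge of the table.
print(f"Posterior mean position: {(grid * belief).sum():.2f}")   # ~0.14
```

Each new report simply multiplies the current belief by the likelihood of that observation and renormalises, so the estimate sharpens as evidence accumulates, which is exactly the logic Bayes described.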

You can find evidence of Bayesian thinking throughout modern history, from nineteenth-century French and Russian artillery officers adjusting their cannons to account for uncertainties about the enemy's location, air density, wind direction, and more, to Alan Turing cracking the German Enigma codes during World War II. Bayes has even influenced the design of AI and machine learning techniques, notably with naive Bayes classifiers, a family of algorithms used to predict the category a data object belongs in. They're used in a wide range of applications, from social media sentiment analysis to spam filtering and movie recommendation systems.
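As a concrete illustration of that last point, here is a toy naive Bayes spam filter built with scikit-learn. The four example messages and their labels are made up for demonstration and are not from the article.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Hypothetical training data: 1 = spam, 0 = not spam.
messages = [
    "win a free prize now",
    "limited offer, claim your reward today",
    "meeting moved to 3pm, see agenda attached",
    "can you review the draft before Friday?",
]
labels = [1, 1, 0, 0]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(messages)   # bag-of-words counts per message

# "Naive" because the model treats each word as independent given the class,
# which keeps the probability calculations simple and fast.
model = MultinomialNB()
model.fit(X, labels)

new_message = vectorizer.transform(["claim your free prize"])
print(model.predict_proba(new_message))  # [P(not spam), P(spam)] for the new message
```

Under the hood the classifier is just Bayes's rule applied at scale: it combines per-word probabilities for each class and reports which explanation of the message is more probable.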

For modern leaders, Bayesian thinking has also become increasingly influential. For example, at Amazon, one of the 14 Leadership Principles is “Have Backbone; Disagree and Commit,” which, as Jeff Bezos has explained, is a strategy to encourage leaders to avoid wasting time trying to secure universal agreement. Better to commit to a controversial decision, and then gather data and adjust if necessary. At X, Alphabet’s moonshot factory, they consciously celebrate failed projects as data points that help them narrow the range of options and, in doing so, accelerate innovation. Similarly, at Spotify, they have developed a framework for exploring the relationship between data and uncertainty that they call DIBB (Data, Insights, Beliefs, and Bets). They use it to explicitly identify success metrics for new ideas and opportunities and to create a common language around judging performance.

Data can be imperfect, incomplete, or uncertain. There is often more than one explanation for why things happened the way they did, and by examining those alternative explanations through the lens of probability, you can gain a better understanding of causality and of what is really going on.

However, thinking probabilistically takes some getting used to, because the human mind is naturally deterministic. We generally believe that something is true or false. Either you like someone or you don’t. There is rarely, for example, a situation when you can say that there is a 46% probability that someone is your friend (unless you are a teenager with lots of frenemies). Our instinct for determinism may well have been an evolutionary adaptation: to survive, we had to make snap judgments about the world and our responses to it. When a tiger is approaching, there is really not a lot of time to consider whether it comes as friend or foe.

* * *

Here is a direct link to the complete article.

Mike Walsh is the author of The Algorithmic Leader: How to Be Smart When Machines Are Smarter Than You. Walsh is the CEO of Tomorrow, a global consultancy on designing companies for the 21st century.
