The Cognitive Shortcut That Clouds Decision-Making

Here is an excerpt from an article published in the MIT Sloan Management Review. To read the complete article, check out others, and obtain subscription information, please click here.

Illustration Credit: Dan Page

* * *

Merely repeating false claims increases their believability, leaving business leaders vulnerable to basing decisions on misinformation. Here are four strategies to prevent this.

Meetings are as effective over Zoom as they are face-to-face. A four-day workweek makes employees more productive. Few complaints mean that customers are happy. Innovation requires disruption.

Business leaders regularly confront these and similar claims. But what makes people believe that they are true? And, more critically, how do such claims affect strategic decisions?

We live in a time of unprecedented access to information that’s available anytime and anywhere. Even when we don’t actively seek out opinions, reviews, and social media posts, we are constantly subjected to them. Simply processing all of this information is difficult enough, but there’s another, more serious problem: Not all of it is accurate, and some is outright false. Even more worrying is that when inaccurate or wrong information is repeated, an illusion of truth occurs: People believe repeated information to be true — even when it is not.

Misinformation and disinformation are hardly new. They arguably have been impairing decision-making for as long as organizations have existed. However, managers today contend with incorrect and unreliable information at an unparalleled scale. The problem is particularly acute for Big Tech companies like Facebook, Google, and Twitter because of the broad societal effects of misinformation on the platforms. A recent study of the most-viewed YouTube videos on the COVID-19 pandemic found that 19 out of 69 contained nonfactual information, and the videos that included misinformation had been viewed more than 62 million times.1

In the world of corporate decision-making, misinformation proliferates in many forms, including public-relations spin, fake reviews, employee “bullshitting,” and rumormongering among current and prospective employees. Executives can find themselves on the receiving end of falsified data, facts, and figures: information too flawed to base critical decisions upon. Misinformation obstructs good decision-making regardless of whether it was mistakenly passed along or shared with ill intent.

Everyone in an organization, from the CEO to front-line employees, routinely faces the challenge of deciding whether a piece of information is true. This is not always easy, and it is complicated by a strikingly banal but powerful bias in how we make sense of information. It’s a glitch of the human mind: we tend to perceive repeated information as more believable than information we hear for the first time, regardless of whether it is in fact true. In other words, our judgments about a statement’s truth are shaped not only by its content but also by how often we have encountered it.

Why Does Repeated Misinformation Ring True?

In 1939, U.S. President Franklin D. Roosevelt famously warned, “Repetition does not transform a lie into a truth.” Unfortunately, decades of research have shown that repeating false information can create at least an illusion of truth. Psychologists refer to this phenomenon as the repetition-based truth effect or, in short, the illusory truth effect.2 Information we hear again and again acquires a “stickiness” that affects our judgment, and the effects are neither fleeting nor shallow. The illusory truth effect is in fact extremely robust, and its power has been shown repeatedly across a range of domains, from political speeches to product claims. One unsettling finding is that people fall prey to the effect not only when the original information is still fresh in their memories but also months after exposure. In addition, people judge repeated information as truer even when they cannot recall having previously encountered the information.

Repeated information can be persuasive even when we “know better” — when false claims contradict well-known facts or come from an untrustworthy source. For example, a manager might know that an employee is a notorious gossip but could still be influenced by a rumor the employee spreads. This is because over time, content (the rumor) often becomes disconnected in memory from its source (the untrustworthy employee). It’s the well-known feeling of being sure that you have heard or read a piece of information before but being unable to recall where it came from. Some of our own research suggests that the illusory truth effect even persists when people are offered monetary incentives to make accurate judgments, such as when rewards are offered to employees. Enlisting a trusted expert to counter false information doesn’t help either; studies show that people believe repeated misinformation even when a reliable source argues that the information is incorrect.

While important organizational actions are typically based on a rigorous assessment of the available facts, the illusory truth effect can still influence people when they are gathering information and discussing the decision. For example, a team member might repeatedly but incorrectly argue that moving production to another country won’t hurt the company’s image. This alone might not be the force that drives the decision, but it could be one of the many pieces of information that contributes to the choice. Not every fact can be verified, especially in situations with some degree of uncertainty.

Why do people tend to believe repeated information more than new information? The following scenario shows how this commonly happens.

  • A manager looking to hire a new member of the sales team has short-listed two candidates with comparable profiles, Jane and Susan, and interviews are scheduled for Friday.
  • On Monday, a team member remarks that Jane is very knowledgeable about the company’s product lines. This is new information, and the manager’s mind makes a connection between the candidate “Jane” and the concept “knowledgeable.”
  • On Wednesday, the same team member again mentions that Jane is knowledgeable about the company’s products, reinforcing the existing connection between “Jane” and “knowledgeable.”
  • On Friday, both Jane and Susan say they know a lot about the company’s products. Because the information about Susan is new and the information about Jane is not, the connection between “Susan” and the concept “knowledgeable” is not as strong as the connection between “Jane” and “knowledgeable.” And because the repeated information about Jane feels more familiar than the new information about Susan, the manager processes it more easily. This ease with which we digest repeated information is termed processing fluency, and we use this as evidence of truth. The manager is more likely to hire Jane.
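
The associative mechanism in the scenario above can be sketched as a toy model. The integer “association strengths” and the fluency-as-truth rule below are illustrative assumptions for the sake of the sketch, not quantities from the original article:

```python
# Toy model of the illusory truth effect in the hiring scenario.
# Association strengths are a simplification: each mention of a claim
# adds one unit of familiarity, and familiarity is (mis)read as truth.

from collections import defaultdict

# Strength of the memory link between a candidate and "knowledgeable".
association = defaultdict(int)

def hear_claim(candidate: str) -> None:
    """Each repetition strengthens the existing memory connection."""
    association[candidate] += 1

def perceived_credibility(candidate: str) -> int:
    """The shortcut: processing fluency (link strength) is treated
    as evidence that the claim is true."""
    return association[candidate]

# Monday and Wednesday: a team member mentions Jane's knowledge twice.
hear_claim("Jane")
hear_claim("Jane")

# Friday: both candidates make the identical claim in their interviews.
hear_claim("Jane")
hear_claim("Susan")

# Same claim, but the repeated version feels more familiar, and
# therefore more credible, so the manager leans toward Jane.
print(perceived_credibility("Jane"))   # 3
print(perceived_credibility("Susan"))  # 1
```

The point of the sketch is that nothing about the claim itself changed between candidates; only the repetition count did, yet the model (like the manager) ends up favoring Jane.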

Given the capabilities of the human mind, it seems remarkable that people so easily accept information simply based on their repeated exposure to it. Yet consider: How do most people know that Tim Cook is the CEO at Apple? Or that bitcoin is volatile? It’s because they have encountered this information multiple times in the past. Repetition is central to how people learn and acquire knowledge, and it makes sense that information encountered repeatedly should be more credible. Experience teaches us that most of the information we are exposed to every day is probably factually correct, especially if we encounter it more than once. Accepting repeated information as true helps us navigate an increasingly complex information landscape. The mind has learned a functional shortcut that uses processing fluency as a sign that information is valid and accurate.

Because this shortcut is so efficient, we tend to overrely on it. But in today’s complex and uncertain information ecosystem, ease of processing and repetition are not reliable guides to truth. Leaders must not only be able to distinguish facts from falsehood but — critically — also must guard themselves, their teams, and their organizations against being intentionally or unintentionally misled.

* * *

Here is a direct link to the complete article.

* * *

Dear Reader:

If you are so inclined, please ask one colleague or friend to sign on by clicking here.

Thank you.

Bob Morris

