Here is an excerpt from an article written by John Beshears and Francesca Gino for Harvard Business Review and the HBR Blog Network. To read the complete article, check out a wealth of free resources, obtain subscription information, and sign up for HBR email alerts, please click here.
* * *
By now the message from decades of decision-making research and recent popular books such as Daniel Kahneman’s Thinking, Fast and Slow should be clear: The irrational manner in which the human brain often works influences people’s decisions in ways that they and others around them fail to anticipate. The resulting errors prevent us from making sound business and personal decisions, even when we’ve accumulated abundant work experience and knowledge.
Unfortunately, even though we know a lot about how biases like overconfidence, confirmation bias, and loss aversion affect our decisions, we still struggle to counter them in a systematic fashion, and they continue to lead us into poor decisions. As a result, even when executives think they are taking appropriate steps to correct or overcome employee bias, their actions often don’t work.
What’s the solution? Behavioral economics — the study of how people make decisions, drawing on insights from the fields of psychology, judgment and decision-making, and economics — can provide an answer. Since it is so difficult to rewire the human brain in order to fundamentally undo the patterns that lead to biases, behavioral economics advocates that we accept human decision-making errors as given and instead focus on altering the decision-making context in ways that lead to better outcomes. Managers can use this knowledge to improve the effectiveness of a process or system inside their organizations.
Just as an architect thinks carefully about how to best design environments and physical spaces to avoid inefficiencies, managers can adopt choice architecture. Choice architecture, a term used by Richard Thaler and Cass Sunstein in their 2008 book Nudge: Improving Decisions about Health, Wealth, and Happiness, refers to the way in which people’s decisions can be influenced by how choices are presented to them. Once managers consciously recognize the flawed thinking that is part of human nature, they can find ways to better design decision-making contexts.
But how to do this? Let’s consider an example. Maybe you remember how on Seinfeld, George Costanza would leave his car parked at the office on purpose, so that his boss would think he was working long hours. That’s an attempt to take advantage of what psychologists call input bias — the tendency to use signs of effort to judge outcomes, when actually the two may have little to do with each other. In this case, Costanza used the bias to his advantage, to change the way his boss judged his productivity.
But knowing about this bias can also help managers enhance organizational effectiveness, for instance by helping them identify important elements of the “choice architecture” that improve the customer experience. In a recent paper, scholars Ryan Buell and Mike Norton (both at Harvard Business School) studied ways in which service organizations could improve customer satisfaction. They found that when a company visually showed the effort it exerted during a transaction, customers were more likely to be satisfied while waiting for the service. When people can see the effort expended on their behalf in the delivery of a service — what Buell and Norton call “operational transparency” — they not only mind waiting less, but they actually value the service more.
Here’s how it works. In one of their studies, Buell and Norton created a fictitious travel website and asked people to search for a flight from Boston to Los Angeles. Some people saw a typical progress bar slowly being colored in, but others experienced operational transparency: The site showed each airline it was searching — “Now searching delta.com… Now searching jetblue.com…” — and displayed a running tally of the most affordable flights found so far. Although all participants then received the same list of flights and fares, those who experienced this transparency rated the service much more highly than those who simply viewed the progress bar. And when asked to choose between a site that delivered instant results or one that made them wait, but showed its work, most people chose the latter.
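To make the mechanism concrete, here is a minimal sketch of what operational transparency might look like in code. It is not from Buell and Norton’s study; the airline names, fares, delay, and the searchWithTransparency function are illustrative assumptions. The point is simply that the interface narrates each step of the work instead of hiding it behind an opaque progress bar.

```typescript
// Hypothetical sketch: narrating each step of a flight search ("operational
// transparency") rather than showing only a generic progress bar.
// Airline names, fares, and the delay are made up for illustration.

type Fare = { airline: string; price: number };

const AIRLINES: Fare[] = [
  { airline: "delta.com", price: 289 },
  { airline: "jetblue.com", price: 254 },
  { airline: "united.com", price: 301 },
];

const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

async function searchWithTransparency(): Promise<Fare> {
  let best: Fare | null = null;
  for (const result of AIRLINES) {
    // Show the work as it happens, the way the experimental site did.
    console.log(`Now searching ${result.airline}...`);
    await sleep(500); // stand-in for a real fare lookup
    if (best === null || result.price < best.price) {
      best = result;
      console.log(`  Cheapest fare so far: $${best.price} (${best.airline})`);
    }
  }
  return best!;
}

searchWithTransparency().then((best) =>
  console.log(`Done. Best fare: $${best.price} on ${best.airline}`)
);
```

Both versions of the page return the same results in the same amount of time; only what the customer sees while waiting differs.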
To take another example, consider the default bias: To avoid the discomfort of complex choices, individuals usually opt for the default supplied to them, even when choosing an alternative does not require much effort. Knowledge of this bias has led to a growing trend among employers to use defaults when presenting employees with the choice of whether or not to save for retirement in an employer-sponsored savings plan. Companies are increasingly enrolling new hires in retirement savings plans automatically; individuals must explicitly opt out if they are not interested in saving. Because automatic enrollment policies recognize the human tendency to put off taking an important action, even when that action is personally beneficial, they lead to large increases in retirement plan participation.
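As a rough sketch (not drawn from the article), the same intervention can be expressed as a small change in a hypothetical enrollment form’s defaults: the available options stay the same, and only the pre-selected starting point differs. The field names and the 3% contribution rate below are assumptions for illustration.

```typescript
// Hypothetical sketch of choice architecture for retirement-plan enrollment.
// Field names and the 3% contribution rate are illustrative assumptions.

interface EnrollmentChoice {
  enrolled: boolean;         // pre-selected answer the employee sees
  contributionRate: number;  // default deferral rate if enrolled
}

// Opt-in design: the default is "not enrolled"; inertia works against saving.
const optInDefault: EnrollmentChoice = { enrolled: false, contributionRate: 0 };

// Opt-out design: the default is "enrolled"; inertia now works in favor of saving.
// Employees keep exactly the same options; only the starting point changes.
const optOutDefault: EnrollmentChoice = { enrolled: true, contributionRate: 0.03 };

function finalChoice(
  defaults: EnrollmentChoice,
  employeeActed: boolean,
  override?: EnrollmentChoice
): EnrollmentChoice {
  // Many employees never act, so whatever the default says is what happens.
  return employeeActed && override ? override : defaults;
}

console.log(finalChoice(optInDefault, false));  // { enrolled: false, contributionRate: 0 }
console.log(finalChoice(optOutDefault, false)); // { enrolled: true, contributionRate: 0.03 }
```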
* * *
Here is a direct link to the complete article.
John Beshears, a behavioral economist, is an assistant professor of business administration at Harvard Business School, where he is co-chairing a new executive education program on behavioral economics.
Francesca Gino, a behavioral scientist, is a professor of business administration at Harvard Business School, where she is co-chairing a new executive education program on behavioral economics. She is the author of the book Sidetracked: Why Our Decisions Get Derailed, and How We Can Stick to the Plan.