You’re About to Make a Terrible Mistake!
| Book Author | Olivier Sibony |
|---|---|
| Published | July 14, 2020 |
| Pages | 336 |
| Greek Publisher | Κάτοπτρο |
How Biases Distort Decision-Making – and What You Can Do To Fight Them
What’s it about?
You’re About to Make a Terrible Mistake! (2020) deals with the negative but often predictable effects that cognitive biases have on high-stakes decisions. Far from claiming that biases can be eliminated altogether, it demonstrates how every decision is colored by bias and outlines specific techniques that’ll help you make more rational, fact-based business decisions.
About the author
Olivier Sibony is a professor at HEC Paris and Oxford University. His research centers on minimizing the influence of cognitive biases on decision-making. With Daniel Kahneman, he cowrote the article “Before You Make That Big Decision.” He is also a coauthor of Cracked It! How to Solve Big Problems and Sell Solutions Like Top Strategy Consultants.
Key Ideas
Many business leaders trust their intuition. They’ve been around the block a few times, and they’ve developed a sixth sense for opportunity. It doesn’t matter what kind of decision needs to be made – such leaders always rely on their gut.
Now, for some decisions, that’s fine. But when it comes to business strategy, intuition is not only unreliable; it can be downright unprofitable. For less consequential choices, maybe you can afford a few misfires. But when the stakes are high, there’s no room for failure.
If we take a closer look at the way business decisions are made, we can design a process that leads to better outcomes and less guesswork. Filled with valuable insights and simple strategies, these blinks will help you fend off that invisible foe which often promises the world but delivers a washout – cognitive bias.
In these blinks, you’ll learn
- what makes strategic decisions different from other types of decisions;
- why we often mistakenly attribute a company’s success to a single source; and
- how “a Snapple” has come to mean an especially lousy business deal.
You’ve just read an amazing study. According to the research, drinking a glass of red wine has health benefits comparable to spending an hour at the gym. This is fantastic news, because, first, you love red wine and, second, you’re an investor in fitness facilities!
So what do you do? Well, obviously, you go with your gut and buy yourself a vineyard. No need to think twice. No need to talk to your team. The future of fitness is wine!
Clearly, this would be a terrible mistake. Your personal fondness for the grape does not mean red wine will revolutionize the world of fitness. You are biased toward wine – and before you make any big business decisions, you’ve got to banish that bias.
The key message here is: We can’t eliminate bias. But we can take steps to reduce its effects.
Here’s the bad news: bias is hard to banish. And, within a large organization, it’s pretty much impossible to get rid of. Just consider a complex organization such as NASA: it’s full of individuals, each susceptible to various biases. But that doesn’t mean we should shrug our shoulders and categorize bias as an insoluble problem. Why? Because bias is at the root of many organizational issues.
So how exactly does bias work? Well, if you’ve ever relied on your own subjective experience rather than objective facts when making a decision, then you’ve been led by bias. Bias is most common when people are making judgment calls – and it’s no secret how commonplace judgment errors are, especially when it comes to business strategy.
In one survey of about 2,000 executives, only 28 percent thought their company generally made good strategic decisions. The majority, about 60 percent, said bad decisions were roughly as frequent as good ones – meaning a coin flip might make decisions as good as, if not better than, those of a team of highly qualified executives. You can imagine how bad this is for business.
But, again, bias can’t be rooted out entirely. Luckily, it doesn’t need to be. Let’s return to the example of NASA. Sure, it’s full of biased individuals, but it’s also established collective processes that counteract those individual biases.
So how can you follow NASA’s lead and diminish bias within your business? One way is to reconsider your approach to leadership. Rather than being the person who makes the final call, aided by gut instinct alone, you should be someone who organizes how choices are presented. Becoming a decision architect instead of a decision maker, therefore, is the most effective way to lessen the impact of individual bias and steer your strategy in the right direction.
In 1975, two men contacted Elf Aquitaine, a French oil company. The men claimed that they’d developed a revolutionary new technology: a device that could “sniff” oil from the air. All you had to do was put it in a plane and fly around, sniffing out the oil in the terrain below. No guessing. No needless drilling.
France was in the middle of an oil crisis, so this technology couldn’t have arrived at a better time. What’s more, it seemed to work: the two men offered demonstration flights, eventually convincing multiple scientists and Elf Aquitaine’s CEO of its legitimacy. Both the prime minister and the president of France were also convinced.
But, of course, it wasn’t legitimate. The oil it “sniffed” out didn’t exist. The images on the oil-detecting device were preloaded. But the CEO, the president, and the prime minister wanted to believe that it worked – and so they did believe it.
Here’s the key message: When we want to believe a story, we look for information that supports it.
When people want to believe something, they’ll actively seek out evidence that supports that belief while ignoring evidence that undermines it. This is called confirmation bias. Now, confirmation bias can come at a high price. In the case of the French oil scam, that price was roughly 1 billion francs – which is how much Elf Aquitaine paid the con artists before catching on.
Elf Aquitaine’s leaders weren’t preposterously gullible. They were simply victims of confirmation bias. The problem was that there was no process in place to counteract that bias. Elf Aquitaine’s leadership thought they were assessing the technology based on the facts. But, in truth, they were only seeking to confirm specific facts – namely, those that supported the story they wanted to believe.
It may be tempting to make fun of the French government for falling for such a hoax. We might even be tempted to question their intelligence. But confirmation bias knows no country – nor is it merely a phenomenon of the past.
In 2004, in California, a very similar story was pitched to a group of well-known investors, including Goldman Sachs. The purported innovation? Oil-sniffing planes. The tool of trickery? Preloaded images. And just as had happened in France some 30 years before, the investors believed what they wanted to believe – that they were on the cusp of something big and profitable. The place and people were different. The mistake, however, was the same.
Apple. MacBook. iPhone. Which name do these words bring to mind? Well, obviously, Steve Jobs! Jobs’s name has become synonymous with individual genius, groundbreaking design, and world-altering vision.
Why is that?
Well, we tend to remember unusual stories of successful decision-making in business and associate them with groundbreaking innovation. We also tend to forget all the other stories – the stories of failure and strategic decisions gone awry. But forgetting the stories of failure and fixating on the unusual tales of success can spell doom for your business.
Here’s the key message: We often mistakenly ascribe the entirety of an organization’s success to one individual.
So sure – Steve Jobs is the protagonist in Apple’s story. But some 60,000 employees played supporting roles. We often forget this, ignoring all the other factors that contributed to Apple’s success. This holds true for other organizations, too. We tend to assume there was a single mastermind – and that, by imitating that individual, we can also replicate his success.
This is called the attribution error. We like to attribute a company’s success to a single, heroic figure, but in holding out for a hero we undervalue the others involved. We also ignore environmental factors, not to mention luck.
The attribution error comes into play when business leaders try to imitate a successful company’s best practices, too. The problem is that it’s difficult to identify which practice is responsible for an organization’s achievements. The result is that leaders frequently misidentify the source of a company’s success.
For instance, you could ascribe Apple’s popularity to the Genius Bar, an in-person tech support service unique to its brick-and-mortar stores. If you have an issue with your Apple product, the staff there will take a look, diagnose the problem, and advise you accordingly. But let’s say you run a cookie-delivery bakery. Would adding a similar service transform it into a multibillion-dollar corporation? Maybe – but it’s unlikely. It’s also not clear that customers need extensive consultation before ordering cookies.
So remember – just because a successful company has a larger-than-life leader or a unique practice does not necessarily mean it’s successful because of that leader or practice. So it may be more helpful to analyze companies that failed. You can learn from their missteps – let’s call them worst practices – rather than trying to piece together disparate stories of anomalous achievements.
Imagine you’re the CEO of Quaker Oats. The year is 1994, and your last major acquisition, Gatorade, has become a tremendous success. So what’s your next step? Well, you think, if a sports drink could become so popular and lucrative, surely iced tea could, too.
This is exactly how William Smithburg, the former CEO of Quaker Oats, defended his purchase of Snapple for a hefty $1.7 billion. Smithburg’s mistake? Going with his gut instead of the numbers. Three years later, he ended up reselling Snapple for a fifth of the original buying price. It was such a fiasco that investment bankers now refer to disastrous acquisitions as “a Snapple.”
The key message here is: Strategic decisions do not typically benefit from intuition.
Now, intuition isn’t always useless. But, according to psychologists Daniel Kahneman and Gary Klein, a strategic situation must meet two conditions before intuition becomes a reliable resource.
The first condition is a high-validity environment – one in which causes and effects are generally predictable. A poker game is a good example. The second condition is the opportunity for extensive practice in that same environment; you could play poker for years, and the game itself would not change.
Practice is important because intuition comes from recognizing environmental cues and noticing irregularities based on past experience. So, next time you’re tempted to make a strategic decision based on intuition, ask yourself: Is the environment high-validity and have I had sufficient practice?
Now, most strategic business decisions are made in what is called a low-validity environment. Here, causes and effects are unpredictable, and there’s no way to practice. Snapple was purchased in a low-validity environment. Sure, Smithburg had made plenty of strategic decisions in the past – but he hadn’t acquired the same company at the same time for the same price in the same market multiple times.
There’s no surefire way to predict the outcome of decisions made in low-validity environments – even if you’re an expert. Psychologist Philip E. Tetlock spent 20 years collecting and analyzing 82,361 predictions from about 300 experts in politics and economics. His findings? The experts’ predictions were not only less accurate than random answers; they were also further off than those of amateurs who answered the exact same questions.
Business, economics, politics – it doesn’t matter what kind of low-validity environment you’re in. No matter what, your intuitive predictions won’t be of much use. The only way to know whether your prediction was right is to wait and see.
Blockbuster Video. Once a household name – now a nostalgic part of the past. Was it just technological change that brought about its demise, or is there more behind the downfall of this former video-rental giant?
In the year 2000, with 300,000 subscribers, Netflix didn’t have enough turnover to make a steady profit. So it approached Blockbuster with a proposition: Blockbuster would buy 49 percent of Netflix, and Netflix would serve as Blockbuster’s online branch. In return, Blockbuster could rent out DVDs through Netflix’s trademark subscription service, which sent people movies in the mail. The price: $50 million.
John Antioco, Blockbuster’s CEO, turned the deal down outright. Netflix CEO Reed Hastings left the negotiation room in defeat.
The key message in this blink is: Overconfidence can cause us to make the wrong decisions.
Fast-forward to 2002. Netflix has gone public, and it now boasts 1 million subscribers. In the meantime, Blockbuster has made a last-ditch attempt to start a competitive subscription service, but to no avail. The writing is already on the wall.
By 2020, Netflix’s market capitalization is valued at around $150 billion – three thousand times the asking price in 2000 – and Blockbuster has long since filed for bankruptcy.
This corporate parable raises an interesting question: Had Antioco been less self-assured – had he taken Netflix seriously and accepted the deal – would Blockbuster still be around today? We can’t say for sure, but we can say that Antioco was overconfident, and that his overconfidence helped seal his company’s fate.
Overconfidence – in this case, an unwarranted faith in the precision of our own estimates – also affects other types of decisions.
Antioco was confident that the online subscription model simply wasn’t sustainable, and he was wrong. He overestimated his ability to make a precise prediction – something that many of us do without being aware of it.
Just consider this classic precision test given by J. Edward Russo and Paul Schoemaker. Respondents were asked ten general knowledge questions – for example, “When was Mozart born?” For this question, they gave their answer as a range of years that they were 90 percent certain contained his birth year. Over two thousand people were asked, and 99 percent of participants chose ranges that were too narrow. For the whole test, subjects answered an average of between three and six questions correctly.
In other words, if we think we’re 90 percent certain, we’ll be wrong, on average, about 50 percent of the time.
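The arithmetic behind that claim can be made explicit. This is a minimal illustrative sketch (not from the book): it takes the reported average of three to six correct answers out of ten and compares the observed hit rate with the 90 percent confidence the respondents claimed.

```python
# Claimed vs. observed calibration in the Russo-Schoemaker test.
claimed_confidence = 0.90        # respondents gave "90 percent certain" ranges
questions = 10                   # general-knowledge questions per respondent
reported_correct = (3, 6)        # average correct answers, per the study

# Midpoint of the reported range: 4.5 correct out of 10.
avg_correct = sum(reported_correct) / 2
observed_hit_rate = avg_correct / questions      # 0.45

# Gap between how certain people felt and how often they were right.
overconfidence_gap = claimed_confidence - observed_hit_rate

print(f"claimed hit rate:  {claimed_confidence:.0%}")
print(f"observed hit rate: {observed_hit_rate:.0%}")
print(f"wrong about {1 - observed_hit_rate:.0%} of the time")
```

So a “90 percent certain” range that actually captures the answer only 45 percent of the time leaves you wrong roughly half the time – exactly the mismatch the test exposes.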
Three! Two! One! Houston, I’m pretty sure we have liftoff. Wait – yes, everything should be fine. OK, just hold on for one second. Look, I don’t know if senior staff are on standby, but should we just go for it? The skies look pretty clear. I mean, what’s the worst that could happen?
These are not the words you’d want to hear if you were strapped into a spaceship, waiting for launch. And, of course, no astronaut ever would hear them. NASA doesn’t leave shuttle launches up to chance.
Now, the stakes of most strategic decisions are not as high as those of a rocket launch. But all decision-makers can learn something from the rigor of pre-launch protocol.
Here’s the key message: Rigorous analysis of decision-making processes decreases the variability of the results.
When launching a spaceship, NASA uses a series of predetermined procedures to ensure a more predictable outcome. This approach to decision-making is called collaboration plus process. Organizations generally use collaboration plus process for big decisions that leave no room for guesswork – such as when prepping for space travel. This is why there are so many procedures and checklists.
Often, a similar system is in place in business for lower-level decisions, such as the purchase of office supplies. So you’d think these same organizations would have a comparable quality-control method for their strategic decision-making. But, surprisingly enough, they often don’t.
A 2010 study looked at 1,048 investment decisions made by organizations across a range of industries. The goal of the study was to measure to what extent collaboration plus process influenced the success of these deals. The metric of success was return on investment.
Participants answered a series of questions about the analytical tools involved, how the decision team was chosen, and how the decisions were reached. Return on investment varied greatly. Thirty-nine percent of the variance was explained by industry-specific factors, while analytical methods, such as financial models, explained only 8 percent. The big surprise? Collaboration plus process explained 53 percent of the variance. In other words, the quality of the decision process influenced the investment outcome roughly six times as much as the number crunching did.
It may be time to set the calculator aside and spend more time thoroughly discussing the calculations, including the risks. And remember: your investment thesis is guilty until proven innocent, so actively search for information that challenges it before making your final decision.
Imagine you’re a chef, and you’re about to open a new restaurant in Greenwich Village, New York. You’ve come up with some new dishes that you’re sure are going to be a hit, but you need to ask some people what they think. As you’re putting together an informal focus group, you start wondering whom to invite. Do you ask your grandmother, who thinks everything you do is amazing? Or do you ask your friend – a food critic who is especially hard to please?
We all need a little grandmotherly validation in our lives. But for the sake of your restaurant, it may be more productive to bring in both the food critic and grandma, plus some people in between.
The key message here is: Disagreement and discussion can lead to more rational strategic decisions.
Disputes can be uncomfortable, and many companies try to keep disagreements to a minimum. Avoiding conflict for the sake of comfort, however, makes it harder to reach good decisions because it stifles genuine debate. Many meetings are merely ceremonial, existing only to confirm conclusions already reached in private. So how can you encourage constructive dialogue?
Cognitive diversity is essential, so make sure that participants represent a wide range of opinions and skills. In other words, the people involved in the decision-making process should come from different professional backgrounds. When everyone has the same training or experience, there’s a good chance they’ll also share the same biases. Encouraging people to challenge each other can help overcome some biases, such as the overconfidence bias.
Just as it takes longer to plan a wedding than to attend one, we need to set aside adequate time to sort out differences of opinion. A meeting whose members have very similar points of view, on the other hand, will not require much time to reach a consensus.
Many in management positions believe that the goal of a meeting is a decision. The agenda is often geared toward this decision, and if a general agreement isn’t achieved, the meeting is considered an automatic failure. A wise leader can gauge the development status of a topic, and then decide whether a meeting would be better organized around a discussion or, if it’s ready, a decision. Recognizing that different stages of a decision require different types of meetings is the first step to designing an effective process for deliberate decision-making.
The key message in these blinks:
Strategic successes are often unique, whereas strategic mistakes have a lot in common. When faced with strategic decisions, our thought process is subject to individual biases that often make the outcome less rational, and therefore less predictable. It’s almost impossible to overcome these biases, especially when time is of the essence. We can, however, influence the environments in which the decisions are made to lessen the effects of these biases.
Actionable advice:
Framework or criteria?
Many of the more routine decisions made on a daily basis are not unique. In order to stick to the facts, using a framework such as a checklist can help ensure the quality and consistency of repetitive choices. For strategic decisions, on the other hand, it’s more helpful to outline specific criteria well before a final decision has to be made. Over the course of a decision process, general opinion may be pulled this way and that; by frequently referring back to predefined criteria, you can counteract the discussions’ dynamics and avoid making a decision that favors biases over rationality.
What to read next: The Optimism Bias, by Tali Sharot
Now that you’ve gotten a taste of the biases affecting our thinking and a few techniques on designing a more rational decision-making environment, you might be interested in exploring the topic from a different angle.
You’ve seen how cognitive biases such as overconfidence influence your decisions for the worse. Well, The Optimism Bias takes optimism and pessimism as its starting point to investigate how our expectations shape our choices, exploring both the positive and the negative sway of optimism – how it can buoy us up and how it can hamper our judgment. Want to learn more? Then head over to the blinks to The Optimism Bias!