
Beyond Bias

Neuroscience research shows how new organizational practices can shift ingrained thinking.

A version of this article appeared in the Autumn 2015 issue of strategy+business.

Imagine that you are hiring an employee for a position in which a new perspective would be valuable. But while reviewing resumes, you find yourself drawn to a candidate who is similar in age and background to your current staff. You remind yourself that it’s important to build a cohesive team, and offer her the job.

Or suppose that you’re planning to vote against a significant new investment. This is the second time it’s come up, and you voted no before. A colleague argues that conditions have changed, the project would now be highly profitable, and you can’t afford to lose this opportunity. Upon closer examination, you see that his data is convincing, but you vote no again. Something about his new information just doesn’t feel relevant.

These are examples of common, everyday biases. Biases are nonconscious drivers — cognitive quirks — that influence how people see the world. They appear to be nearly universal, perhaps hardwired into the brain as part of our genetic or cultural heritage, and they exert their influence outside conscious awareness. You cannot go shopping, enter a conversation, or make a decision without your biases kicking in.

On the whole, biases are helpful and adaptive. They enable people to make quick, efficient judgments and decisions with minimal cognitive effort. But they can also blind a person to new information, or inhibit someone from considering valuable options when making an important decision.

A number of biases occur so often, in so many contexts, that cognitive scientists have given them names. (See “Common Biases,” below.) Some, like the confirmation bias (which leads people to discount information that disagrees with their assumptions), have been critical factors in financial crises, including the Great Recession that began in 2007. That crisis also derived in part from the temporal discounting bias: Bankers chose to pursue immediate gains, even if that meant ignoring long-term risks. Two other common biases, the illusion of control and the planning fallacy, adversely affected Japan’s preparedness for the 2011 tsunami, as well as New York’s ability to recover from Hurricane Sandy in 2012. People overestimate the degree to which they can control the negative effects of a disaster and underestimate the time and effort it would take to prepare for one. All of these biases, and others, lead many great companies and institutions to make disastrous and dysfunctional decisions.

Common Biases

Similarity

Ingroup Bias: Perceiving people who are similar to you (in ethnicity, religion, socioeconomic status, profession, etc.) more positively. (“We can trust her; her hometown is near mine.”)

Outgroup Bias: Perceiving people who are different from you more negatively. (“We can’t trust him; look where he grew up.”)


Expedience

Belief Bias: Deciding whether an argument is strong or weak on the basis of whether you agree with its conclusion. (“This logic can’t be right; it would lead us to make that investment I don’t like.”)

Confirmation Bias: Seeking and finding evidence that confirms your beliefs and ignoring evidence that does not. (“I trust only one news channel; it tells the truth about the political party I despise.”)

Availability Bias: Making a decision based on the information that comes to mind most quickly, rather than on more objective evidence. (“I’m not worried about heart disease, but I live in fear of shark attacks because I saw one on the news.”)

Anchoring Bias: Relying heavily on the first piece of information offered (the “anchor”) when considering a decision. (“First they offered to sell the car for $35,000. Now they’re asking $30,000. It must be a good deal.”)

Base Rate Fallacy: When judging how probable something is, ignoring the base rate (the overall rate of occurrence). (“I know that only a small percentage of startups succeed, but ours is a sure thing.”)

Planning Fallacy: Underestimating how long it will take to complete a task, how much it will cost, and its risks, while overestimating its benefits. (“Trust me, we can finish this project in just three weeks.”)

Representativeness Bias: Believing that something that is more representative is necessarily more prevalent. (“There may be more qualified programmers in the rest of the world, but we’re staffing our software design group from Silicon Valley.”)

Hot Hand Fallacy: Believing that someone who was successful in the past has a greater chance of achieving further success. (“Bernard Madoff has had an unbroken winning streak; I’m reinvesting.”)

Halo Effect: Letting someone’s positive qualities in one area influence overall perception of that individual. (“He may not know much about people, but he’s a great engineer and a hard-working guy; let’s put him in charge of the team.”)


Experience

Blind Spot: Identifying biases in other people but not in yourself. (“She always judges people much too harshly.”)

False Consensus Effect: Overestimating the universality of your own beliefs, habits, and opinions. (“Of course I hate broccoli; doesn’t everyone?”)

Fundamental Attribution Error: Believing that your own errors or failures are due to external circumstances, but others’ errors are due to intrinsic factors like character. (“I made a mistake because I was having a bad day; you made a mistake because you’re not very smart.”)

Hindsight Bias: Seeing past events as having been predictable in retrospect. (“I knew the financial crisis was coming.”)

Illusion of Control: Overestimating your influence over external events. (“If I had just left the house a minute earlier, I wouldn’t have gotten stuck at this traffic light.”)

Illusion of Transparency: Overestimating the degree to which your mental state is accessible to others. (“Everyone in the room could see what I was thinking; I didn’t have to say it.”)

Egocentric Bias: Weighing information about yourself disproportionately in making judgments and decisions — for example, about communications strategy. (“There’s no need for a discussion of these legal issues; I understood them easily.”)


Distance

Endowment Effect: Expecting others to pay more for something than you would pay yourself. (“This is sure to fetch thousands at the auction.”)

Affective Forecasting: Judging your future emotional states based on how you feel now. (“I feel miserable about it, and I always will.”)

Temporal Discounting: Placing less value on rewards as they move further into the future. (“They made a great offer, but they can’t pay me for five weeks, so I’m going with someone else.”)


Safety

Loss Aversion: Making a risk-averse choice if the expected outcome is positive, but making a risk-seeking choice to avoid negative outcomes. (“We have to take a chance and invest in this, or our competitors will beat us to it.”)

Framing Effect: Basing a judgment on whether a decision is presented as a gain or as a loss, rather than on objective criteria. (“I hate this idea now that I see our competitors walking away from it.”)

Sunk Costs: Having a hard time giving up on something (a strategy, an employee, a process) after investing time, money, or training, even though the investment can’t be recovered. (“I’m not shutting this project down; we’d lose everything we’ve invested in it.”)

In a hyperconnected world, where poor decisions can multiply as if in a chain reaction, breaking free of unhelpful bias has never been more important. That is why many large organizations are putting money and resources toward educating people about biases. For example, U.S. companies spend an estimated US$200 million to $300 million a year on diversity programs and sensitivity training, in which executives, managers, and employees are told to watch out for biases, particularly when making hiring and promotion decisions.

Unfortunately, there is very little evidence that educating people about biases does anything to reduce their influence. Human biases occur outside conscious awareness, and thus people are literally unaware of them as they occur. As an individual, you cannot consciously “watch out for biases,” because there will never be anything to see. It would be like trying to “watch out” for how much insulin you are producing.

How then can the negative effects of bias be overcome? Collectively. Organizations and teams can become aware of bias in ways that individuals cannot. Team-based practices can be redesigned to help identify biases as they emerge, and counteract them on the fly, thus mitigating their effect.

The first step is to identify the types of bias likely to be prevalent in organizations. To that end, we have grouped the 150 or so known common biases into five categories, based on their underlying cognitive nature: similarity, expedience, experience, distance, and safety. (Our research group has named this the SEEDS™ model.) Each category has defining features as well as mitigation strategies specific to it. Once you know which type of bias you are dealing with, you can put the strategies in place and make more effective decisions.

Similarity Biases

“People like me are better than others.”

If you are like most people, you are highly motivated to focus your attention on anything that portrays you in the best possible light. This motivation affects the way you perceive other people and groups. The similarity biases are part of your brain’s natural defenses; they promote and protect those associated with you — including your family, team, and company. But they also perpetuate stereotypes and prejudice, even when counterproductive.

The two most prevalent forms of similarity bias are ingroup and outgroup preferences. You hold a relatively positive perception of people who are similar to you (the ingroup) and a relatively negative perception of those who are different (the outgroup). Even when you are not aware of these two biases, they are reflected in your behavior. For example, as described earlier, you are more likely to hire ingroup members — and once you hire them, you’re likely to give them bigger budgets, bigger raises, and more promotions.

Social neuroscience research has shown that people perceive and relate to ingroup and outgroup members very differently. In fact, merely assigning people to arbitrary teams creates greater liking for fellow members of the team, less liking of members of another team, and greater activity in several brain regions involved in emotion and decision making (the amygdala, orbitofrontal cortex, and striatum) in response to ingroup faces.

Similarity biases affect many decisions involving people, including which clients to work with, which social networks to join, and which contractors to hire. A purchasing manager might prefer to buy from someone who grew up in his or her hometown, just because it “feels safer.” A board might grant a key role to someone who looks the part, rather than to the person who could do the job best. The bias is unfortunate because research (for example, by Katherine Phillips) has shown that teams and groups made up of people with varying backgrounds and perspectives consistently make better decisions and execute them more effectively.

The best way to mitigate similarity bias is to find commonalities with those who appear different. You can’t erase your preference for the ingroup, but you can widen the circle of people it includes. Pay attention (and draw your team’s attention) to the goals, values, experiences, and preferences that you share with the outgroup. This causes the brain to recategorize those individuals and thus creates a more level playing field. For hiring and promotion decisions, remove potentially biasing information or features (name, sex, ethnicity) from formal materials. Even though people are aware of ethnicity and gender in any face-to-face encounter, the absence of written cues that reinforce them can help. Instead, cue similarity: Pepper the documents with references to the ways in which different types of people contribute, or to how someone is “one of us.” Studies have found, for example, that considering a man and a woman for promotion at the same time leads to fairer treatment than considering either person alone.
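One way to put that blind-screening advice into practice is to strip identity cues from candidate records before reviewers ever see them. The following Python sketch is purely illustrative: the Candidate fields and the redaction list are hypothetical, and a real pipeline would tailor both to the organization and to applicable law.

```python
# Minimal sketch of "blind" screening: remove identity cues from a
# candidate record before reviewers see it. All field names are hypothetical.
from dataclasses import dataclass, asdict

@dataclass
class Candidate:
    name: str
    gender: str
    hometown: str
    skills: list
    years_experience: int

# Fields that can cue ingroup/outgroup judgments; the exact list is an
# assumption and would vary by organization.
IDENTITY_FIELDS = {"name", "gender", "hometown"}

def redact(candidate: Candidate) -> dict:
    """Return a view of the candidate with identity cues removed."""
    return {k: v for k, v in asdict(candidate).items()
            if k not in IDENTITY_FIELDS}

applicant = Candidate("Jane Doe", "F", "Springfield", ["Python", "SQL"], 7)
print(redact(applicant))
# -> {'skills': ['Python', 'SQL'], 'years_experience': 7}
```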

Expedience Biases

“If it feels right, it must be true.”

Expedience biases can be described as mental shortcuts that help us make quick and efficient decisions. As Daniel Kahneman pointed out in Thinking, Fast and Slow, the human brain has two parallel decision-making systems. “System 1” relies on information that can be retrieved without much effort: Its associations “feel right.” That’s what makes it expedient. When people need to make decisions based on more objective, less accessible information, the brain’s “System 2” has to get involved. System 2 is slower, more difficult to engage, and less pleasurable. It can be called upon to correct System 1’s mistakes, but it requires more cognitive effort and concentration. Most people naturally tend to favor System 1.

One very common form of expedience bias is the availability bias. This is the tendency to make a decision based on the information that’s most readily accessible in the brain (what comes to mind most quickly) instead of taking varied perspectives into account. This bias inhibits us from looking for and considering all potentially relevant information. It can thus block the brain from making the most objective and adaptive decisions. The case of the forgone investment described at the beginning of this article shows the subtle, corrosive influence of the availability bias.

Expedience biases tend to crop up in decisions that require concentrated effort: complex calculation, analysis, evaluation, or the formation of conclusions from data. A sales rep or consultant who automatically reverts to a few familiar solutions, instead of really listening to client problems, is probably suffering from an availability bias. So is a doctor who assumes a new patient has a familiar condition, without more carefully analyzing the diagnosis. Even these portrayals of expedience bias could themselves be examples, since they draw conclusions without fully exploring the details of the sales rep’s or doctor’s decision making.

Expedience biases tend to be exacerbated when people are in a hurry or are cognitively depleted — exhausted from stress and multiple decisions. To mitigate the bias, provide incentives for people to step off the easier cognitive path: encourage them to challenge themselves and others, perhaps by identifying their own mistakes, and foster a culture that rewards this kind of scrutiny. For instance, you might relax a deadline to allow more time for considering alternatives, or ask a sales rep to lay out the logic of his or her approach with a client step-by-step, encouraging both yourself and the rep to identify flaws in the logic.

You can also mitigate expedience biases by breaking a problem into its component parts. It may help to involve a wider group of people and to seek outside opinions as part of the typical decision-making process, or to implement a mandatory “cooling off” period (10 minutes of relaxation or a walk outdoors) before decisions are made under pressure.

Experience Biases

“My perceptions are accurate.”

The human brain has evolved to regard its own perceptions as direct and complete. In other words, people tend to assume that what they see is all there is to see, and all of it is accurate. But this attitude overlooks the vast array of processes within the brain that construct the experience of reality. Your expectations, past experiences, personality, and emotional state all color your perception of what is happening in the world.

Experience biases are particularly pernicious when they breed misunderstandings among people who work together. If you hold a strong conviction that you see reality as it is, then you assume that anyone who sees things differently must either be incorrect or lying. As social neuroscientist Matthew Lieberman has noted, when two people each think the other person is crazy, mean, stupid, prejudiced, dishonest, or lazy, there is often an experience bias at work.

It is very difficult to convince someone who has an experience bias that he or she might be the one who is mistaken. These biases are similar to visual illusions — even if you logically know that it is an illusion, your intuitive experience of it remains powerful. You may find it easy to identify other people’s biases, but not your own. (That’s known as the bias blind spot.) You might also fall prey to the false consensus effect: overestimating the extent to which others agree with you or think the same way you do. For example, if you prefer vanilla to chocolate ice cream, you are likely to think that most people have the same preference. People who prefer chocolate, however, will also assume that they are in the majority. In an organizational setting, this assumption can lead to unnecessary conflicts, especially if leaders assume that many others agree with their preferences, and make decisions accordingly.

Experience biases often manifest themselves when you try to influence others or sell an idea. On a sales call, you might not realize that other people are less excited by your product than you are. When making a presentation, you might forget that others do not know the context. If you are a senior leader pushing for a major organizational change, you might not see that others don’t agree, or that they have legitimate concerns.

Experience biases respond to an organizational approach, so put systems into place that minimize their influence. For example, you can set up practices for routinely seeking opinions from people who are not on the team or project. Other techniques include revisiting ideas after a break to see them in a fresh, more objective light, or setting aside time to look at yourself and your message through other people’s eyes.

Distance Biases

“Near is stronger than far.”

Proximity is a salient unconscious driver of decision making. Brain scan studies have shown that one network in the brain registers all types of proximity — conceptual proximity, such as whether or not you own an object, as well as proximity in space and in time. The closer an object, an individual, or an outcome is in space, time, or perceived ownership, the greater the value assigned to it.

For example, given the choice between receiving $100 today and $150 tomorrow, most people will (quite rationally) wait a day to get the larger sum. But when the choice becomes $100 today versus $150 three months from now, the majority will choose the lesser but more immediate payment — despite the fact that there are very few other ways to earn a guaranteed 50 percent return on investment in three months. Thanks to a distance bias called temporal discounting, the further away in time the $150 is, the more its value decreases. Psychologically, $100 now is worth more than $150 three months away.
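The reversal in this example can be reproduced with the standard hyperbolic-discounting formula, V = A / (1 + kD), in which a reward of amount A delayed by time D has subjective value V. The sketch below is a minimal illustration; the discount rate k is an assumption chosen to show the effect, not an empirical estimate.

```python
# Hyperbolic discounting: the subjective value of an amount shrinks as
# its delay grows. The rate k (per day) is illustrative only.
def subjective_value(amount: float, delay_days: float, k: float = 0.05) -> float:
    """Discounted value of `amount` received after `delay_days` days."""
    return amount / (1 + k * delay_days)

print(subjective_value(100, 0))   # 100.0  -> $100 today
print(subjective_value(150, 1))   # ~142.9 -> $150 tomorrow still feels larger
print(subjective_value(150, 90))  # ~27.3  -> $150 in three months feels smaller
```

With this illustrative discount rate, the model makes the same choices as the people in the example: wait a day for $150, but take $100 now over $150 in three months.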

Distance bias often manifests as a tendency toward short-term thinking instead of long-term investment. It can also lead you to neglect people or projects that aren’t in your own backyard — a particular problem for global organizations whose managers must oversee and develop business and human capital at great distances.

To mitigate this kind of bias, take distance out of the equation. Evaluate the outcome or object as if it were closer to you in space, time, or ownership. This orients you to recognize its full value. Of course, you will still consider time and physical distance as factors when making decisions. For example, as business strategy writer Pankaj Ghemawat has pointed out, the geographic and cultural distance of another country should affect any plans you have to expand your business there. But those elements should factor into the decision consciously, without the unconscious influence of a bias that might lead to an inferior conclusion.

Safety Biases

“Bad is stronger than good.”

The fact that negative information tends to be more salient and motivating than positive information is evolutionarily adaptive. A hunter–gatherer whose brain responds quickly to the threat of a snake would be more likely to survive than one whose brain responds first to the charm of its colorful markings. That’s why, for most people, losing $20 feels worse than finding $20 feels good.

This principle manifests itself in safety biases like loss aversion. When considering a transaction or investment, regardless of the merits of the deal, you are more likely to be drawn to it if you perceive it as a way to avoid a loss than if you see it as an opportunity for gain. You may think of yourself as strongly oriented toward winning, but your actions are likely more influenced by the need to avoid losing, a very different concern.
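The asymmetry between losses and gains can be made concrete with the value function from Tversky and Kahneman’s prospect theory. The sketch below uses their published median parameter estimates (alpha = 0.88, lambda = 2.25) purely for illustration; with these numbers, losing $20 feels roughly 2.25 times as bad as finding $20 feels good.

```python
# Prospect-theory value function (Tversky & Kahneman, 1992): losses are
# weighted more heavily than equivalent gains. Parameters are their
# median estimates, used here for illustration only.
def value(x: float, alpha: float = 0.88, lam: float = 2.25) -> float:
    """Subjective value of a dollar gain (x > 0) or loss (x < 0)."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

print(value(20))   # ~13.9  -> the felt upside of finding $20
print(value(-20))  # ~-31.3 -> the felt downside of losing $20
```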

Another safety bias is the framing effect, first identified by Amos Tversky and Daniel Kahneman in 1981. When an opportunity is framed as a gain, people tend to be relatively conscious of the risk involved. But if the risk is framed as a way to avoid a loss, then people are more likely to ignore or justify it. This is true even though the objective information is the same in both cases.

Safety biases can influence any decision about the probability of risk or return, or about the allocation of resources, including money, time, and people. They affect financial and investment decisions, resource allocation, strategy development, and planning for strategy execution. Examples include being unable to let go of a business unit because of the resources already invested in it, and being unwilling to innovate in a new direction because doing so would compete with the company’s existing business.

To mitigate safety bias, you can conduct conversations that add psychological distance to the decision. Imagine that you are giving advice to someone in your shoes rather than making the decision for your own enterprise. When making decisions for others, you can be less biased because the threat network is not as strongly activated. Or imagine that the decision has already been made, and that you are seeing it from a later point in time. Studies suggest that recasting events this way, from a more objective, distanced perspective, makes them less emotional and less tied to the self.

Managing for Bias

All of the mitigation strategies described in this article engage the brain’s ventrolateral prefrontal cortex, which acts, in this case, like a braking system, helping you exercise cognitive control and broaden your attention beyond your own self-specific viewpoint. As you identify and mitigate biases in your organization, keep four general principles in mind:

• Bias is universal. There is a general human predisposition to make fast and efficient judgments, and you are just as susceptible to this as anyone else. If you believe you are less biased than other people, that’s probably a sign that you are more biased than you realize.

• It is difficult to manage for bias in the moment you’re making a decision. You need to design practices and processes in advance. Consciously identify situations in which more deliberative thought and strategies would be helpful, and then set up the necessary conversations and other mechanisms for mitigating bias.

• In designing bias-countering processes and practices, encourage those that place a premium on cognitive effort over intuition or gut instinct.

• Individual cognitive effort is not enough. You have to cultivate an organization-wide culture in which people continually remind one another that the brain’s default setting is egocentric, that they will sometimes get stuck in a belief that their experience and perception of reality is the only objective truth, and that better decisions will come from stepping back to seek out a wider variety of perspectives and views.

Although more research and development needs to be done on both the theory and practice of breaking bias, we believe that this approach can provide a useful step forward. By reducing the unhelpful biases that are at the heart of many organizational challenges today, not only do you reduce the risk of catastrophic loss — you redefine what it means for an organization to win.

Reprint No. 00345
