uprooting bias

April 1, 2019

Don’t always believe what you think.

There are many types of biases, and many definitions, but at heart all forms of bias come back to the same thing—a predisposition towards a certain object, idea, group of people, or course of action, and a corresponding predisposition away from all the alternatives. We play favorites; we make up our minds about what we want to do without analyzing objectively what we should do; we form judgments about people based on what we believe, rather than what we know. We believe a piece of news or information because we want to believe it, because it conforms to our own beliefs about the world, without regard to whether it might actually be true; and when President Trump rails against fake news, he is, of course, talking about news that says things he does not want to believe. That is an example of bias.

Most of us are not as obviously biased as Mr Trump, but there is an element of bias present in all of us. No one ever makes an entirely objective decision; no matter how hard we try to put our biases to one side, an element of subjectivity always creeps in. However, if we are aware of our own biases and admit them, we stand a better chance of introducing a corrective element into decision-making, and coming to a better and more realistic decision. In this article, I will look at some of the sources of bias, the consequences of allowing bias to take control, and what we can do to overcome them and make better decisions, in business and, indeed, in life.

types of bias

The list of biases—political, cultural, social, economic—is very long and I do not propose even to attempt to discuss them all here. From a business perspective, perhaps the most important biases are cognitive biases, that is, biases that lead us to make decisions that are not fully rational but instead reflect our preconceptions about the subject. Cognitive biases can have two sources: (01) internal, reflecting our brains’ inability to always process information fully or correctly, often through lack of prior information on which to base a decision, and (02) external, the social pressures and other influences that push us into making irrational decisions. Advertising could be said to be a deliberate attempt to create bias on the part of the audience, influencing them towards the advertiser’s products or services even when there is no rational reason for the preference.

I have talked about bias influencing the making of decisions, but in fact there are two kinds of bias we need to consider: bias in perception and bias in decision-making itself. Bias in perception means we see or hear things inaccurately; we do not always hear the words spoken to us, or see what is in front of our eyes, partly because our brains are conditioned not to do so. Women in boardrooms often remark on how they can sometimes appear to be inaudible; they offer a suggestion or opinion and no one listens. Five minutes later, a man offers the same suggestion or opinion and this time everyone listens and nods wisely. This is the result of a bias in members of the male audience who—even if only subconsciously—tune out what women are saying because they believe it is of less value.

Bias in perception means that we have barriers in our minds against certain types of information, and therefore we make decisions based on incomplete data—what Margaret Heffernan refers to as ‘willful blindness’. If we blind ourselves to some aspects of the situation we are in, the chance of making a bad decision increases sharply. But even if we make every effort to understand the facts and exclude nothing, even if we have something close to perfect information, there is still the possibility of bias in the decision-making process itself. The information may be there but we may choose to ignore it; external social pressures like the herd instinct may channel us into making decisions we know are wrong but make anyway; or we may simply lack the cognitive capacity to interpret and analyze the information at hand and come to the correct decision. This last is often known as lack of managerial capacity or lack of managerial competence; to be blunt, it means that managers are not up to the task of making the decisions they need to make.

sources of bias

Rather than run through an exhaustive typology of biases, I will look instead at the sources of bias. Where does it come from? What goes on in our heads that leads us to make irrational decisions? Sifting through lists of biases, I can identify four important causal factors (there are almost certainly many more, but these are the ones that seem to be responsible for the majority of biases):

  • fear, anxiety, and stress
  • self-delusion
  • selfishness and the desire for power
  • mental laziness


fear, anxiety, and stress

As human beings, we are subject to a number of fears, many of them partly irrational. Most of us are, for example, frightened of the unknown. What is not known cannot be trusted and therefore represents a potential threat. In order to make this fear go away, we try to rationalize the unknown and make sense of it. Famously, managers hate uncertainty, and adopt a wide variety of frameworks to analyze uncertain things and reduce them to certainty. As Pablo Triana pointed out in his book Lecturing Birds on Flying, many of these tools actually serve to increase uncertainty. For example, statistical methods intended to price options or risk more accurately can result in less accuracy and a higher level of danger, a fact which Triana argues has been partly responsible for several major financial disasters.

One common bias stemming from a fear of the unknown is framing, the filters or lenses we use to interpret what we see around us. Lacking certainty about what we see, we fall back on past experience, but where we lack experience we turn instead to beliefs, superstitions, and fears. The Enlightenment philosopher David Hume spoke of a bias towards the familiar, a tendency to prefer the company of people like ourselves because we can be more certain of who they are. Without certainty, we turn to framing, often with unpleasant results. Much racial bias is founded on framing; when we encounter another person, we see them not as they really are, but as our beliefs would have them be. You are from such-and-such a country, therefore I believe you will behave in certain ways.

Another attempt to make sense of the unknown is apophenia, the tendency to see patterns where none actually exist. We look at a mass of data and are uncertain what it means, at which point apophenia takes over and we start to find patterns illogically. For example, we see human faces in clouds, or rocks, or loaves of bread, even though we know logically they are not there. This is harmless, but sometimes we start to believe the face really is there, and then apophenia becomes dangerous. One common form of apophenia is the gambler’s fallacy, when people see patterns in the fall of dice or the turn of a roulette wheel and then believe they can predict them. Investors in stock markets are often prone to the same bias.
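The flaw in the gambler's fallacy is easy to demonstrate: each fair coin flip (or die roll, or wheel spin) is independent of what came before. A minimal Python sketch (the streak length and sample size here are arbitrary illustrative choices, not taken from any study):

```python
import random

random.seed(42)

# Simulate many fair-coin flips and check what follows a streak of
# three heads. The gambler's fallacy predicts tails is now "due";
# independence predicts heads still comes up about half the time.
flips = [random.choice("HT") for _ in range(100_000)]

after_streak = [
    flips[i + 3]
    for i in range(len(flips) - 3)
    if flips[i : i + 3] == ["H", "H", "H"]
]

heads_rate = after_streak.count("H") / len(after_streak)
print(f"P(heads after HHH) = {heads_rate:.3f}")  # close to 0.5, not lower
```

However long the streak, the conditional frequency stays near one half; the pattern the gambler sees carries no predictive power.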

Another common form of fear is the fear of isolation. Human beings are pack animals, and most of us are conditioned to want and need the company of others. The herd instinct is a common source of bias, which results in us going along with the majority for fear of being left out, or of standing alone and becoming unpopular and unwanted. It means people will fail to debate issues they know should be debated, or ask questions they know should be answered. They make bad decisions knowing they are making bad decisions—willful blindness, again—and knowing what the consequences might be, but social pressure makes them frightened to make good decisions.

In his book On the Psychology of Military Incompetence, Norman Dixon describes the attitude of some captains in the Royal Navy towards risk. An RN captain who loses his ship, whether to enemy action or shipwreck, is automatically put on trial before a court-martial to determine whether he is at fault. A court-martial brings not only sanctions from the navy, but also the disapproval and opprobrium of one’s fellow officers. To avoid this, some captains will choose not to put their ships in harm’s way, reducing the risk to themselves, even though the situation—sailing into a storm to rescue the crew of a sinking ship, for example—might demand they do the exact opposite.


self-delusion

Self-delusion can be the result of fear, but it can also be an overcompensation for lack of prior knowledge (which itself creates uncertainty and, once again, the desire to establish patterns and create certainty). For example, in the overconfidence effect, people may state that they are 100 percent certain about something, whereas studies have shown that they are actually wrong about 20 percent of the time. In her article ‘The Illusion of Control’, Ellen Langer argues that this is another manifestation of the desire to establish control over uncertain facts and bend them to our will. Observer-expectancy bias is another form of this phenomenon, whereby people take the results of research and bend them to fit their prior expectations. Usually this is because they do not understand the real results, which take them out of their mental comfort zone; observer-expectancy bias is a way of finding their way back into that zone.

There are many other biases based on overconfidence. One of the most famous is the Dunning-Kruger effect, named after David Dunning and Justin Kruger’s 1999 article ‘Unskilled and Unaware of It’. Dunning and Kruger found that while people with high ability were often very much aware of their limitations—as Confucius said, ‘real knowledge lies in knowing the extent of one’s own ignorance’—people with low ability were not. Low-ability people consistently overestimated their own competence and would take on tasks they were not remotely able to handle. Confidence was not backed up by rationality. More recent studies have found that some people regard themselves as highly intelligent because, when faced with a question, they can look up the answer on the Internet.

The self-serving bias, sometimes known as egocentricity, is another example of people ascribing virtues to themselves that are not justified by any rational analysis of the facts. Self-serving bias is all too common among business leaders, who are quick to take credit for any successes that happen along, but equally quick to shift blame onto someone or something else. A decade or so ago, I analyzed profit warnings by UK-listed companies over a period of about three years. When things were going well, boards were quick to pat themselves on the back, but when profits began to slide, only a tiny minority was prepared to accept that bad managerial decisions were responsible. Even then, many would single out one particular manager as a scapegoat, whereupon he or she would conveniently resign a short while later.

Similar to this is survivorship bias, in which it is assumed that longevity among people or institutions is itself a mark of quality or talent. If someone does a job for forty years, we assume—quite illogically—that they are good at their job and know what they are doing, whereas in fact they may be a placeholder who is astute at playing internal political games and has seen off all rivals. The same is true of businesses. Just because a business has been around for a hundred years does not mean it will still be around tomorrow, which makes it very dangerous to draw unstructured lessons from its survival.

selfishness and laziness

Selfishness leads to many similar forms of self-delusion. The sense of superiority here turns into one of entitlement: I am the best, therefore I deserve to get everything I want. Selfishness also leads us, once we have achieved positions of power, to exclude others from those positions, especially if they do not look like us. Power is reserved for our friends, whom we can trust. Those we do not know, who are different from us by reason of gender, ethnicity, religion, and so on, we exclude because we want to keep power for ourselves and our charmed inner circle. This is not so much about getting power as about keeping it out of the hands of others.

Laziness too results in bias, not because we are incapable of understanding data and making rational analyses, but because we cannot be bothered to do so. The way we have always done things has worked before; why change now? Closely allied to laziness is cynicism, where we do not bother to make the extra effort because we believe there is no point. Sometimes this belief leads to corruption: everyone else around us is corrupt, so why should we not help ourselves as well? This belief that corruption is widespread is itself, of course, a form of bias.

remedies for bias

There are techniques for ‘debiasing’ or cognitive bias modification therapy (CBMT), which are widely employed by psychologists. However, these techniques have been criticized on the grounds that they too are subject to bias. What gives a therapist the right to claim that he or she is objective while the patient is biased? Is it not possible that it could be the other way around?

My own experience suggests that the best cures for bias lie within. When analyzing information and making decisions, we need to be clear where our own biases lie, and this requires us to be reflective and honest with ourselves. We can never shed our biases entirely, but once we are aware of them, we can account for them and make allowance for them in our thinking. We need to constantly test and challenge ourselves: why do we think as we do? Why do we believe what we do? And most of all, we need to remember the words of the American Buddhist nun Thubten Chodron: don’t believe everything you think.