Cognitive biases – an introduction

Reading Time: 12 minutes

Last Updated on March 31, 2021 by Jacob

Did you know that our eyes transmit information at roughly 10 million bits per second, yet our conscious cognitive processing has a bandwidth of only about 120 bits per second?
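Using the two figures from the sentence above, a line of arithmetic makes the gap concrete:

```python
# The two figures quoted above; the ratio shows how much raw sensory
# input never reaches conscious processing.
eye_bits_per_s = 10_000_000   # visual transmission rate
conscious_bits_per_s = 120    # conscious processing bandwidth

ratio = eye_bits_per_s / conscious_bits_per_s
print(f"conscious processing handles roughly 1 in {ratio:,.0f} bits")
```

In other words, consciousness sees only about one bit in every 83,000 the eyes deliver, which is exactly why the brain needs shortcuts.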

First introduced by Amos Tversky and Daniel Kahneman in 1972, cognitive biases are the simplified conclusions and mental shortcuts our brains make while processing information. They are the mind's way of interpreting information to optimize our chances of survival and our ability to navigate a world where we are continuously exposed to far more than our brains will ever be able to process.

Cognitive biases influence our beliefs, our way of thinking and every single decision we make, no matter how big or small. Developed over millions of years, cognitive biases have been crucial to human survival and evolution. In today’s modern era, however, cognitive biases are often an obstacle and are considered systematic errors and deviations in rational judgment. The study of cognitive biases is also a cornerstone of behavioral economics, and in everyday life biases are exploited in virtually every marketing campaign. With all the information we are continuously exposed to, and with memory and attention being limited resources, cognitive biases simplify the brain’s information processing.

We believe that we are rational and sensible thinkers, and we rarely perceive that, in fact, all of our decisions and views are influenced by cognitive biases. There are countless biases, but I’ll write about some of the most fascinating, amusing and common ones, which I find especially relevant to leadership and management. Before you jump to the conclusion that cognitive biases are all bad, remember that without them we humans would not be here today: those same shortcuts in thinking are what enabled mankind to survive millions of years of evolution in a world that was much less humane than the one we know today.

Why do we fall into the “trap” of cognitive biases?

From a psychology, biology and neuropsychology perspective, you have to consider that the human brain as we know it today is more than a million years old.

Our brains developed at a time when their primary purpose was to continuously determine whether objects or circumstances could be eaten or were life-threatening.

With every experience, our brain processes information and tries to make sense of it, but somewhere along the way, small shortcuts are made and biases are formed and take up residence in our minds. This happens because of our brain’s limitations, and it happens gradually over time, so most of us don’t even notice it. Cognitive biases increase our mental efficiency by paving the way for quick judgments of good and bad.

A great example of the brain taking shortcuts and jumping to conclusions is the picture below:

Do you spot the Dalmatian? Most people say yes. Our brains need very little information to reach a conclusion about what we are seeing. That is not particularly beneficial today, but it was a lifesaver tens of thousands of years ago, when we had to determine whether what we saw in the tall grass was a saber-toothed tiger that would kill us, or prey we could hunt and eat.

The Dunning-Kruger Effect


This cognitive bias basically states that people who lack the skills necessary to perform a task will overestimate their abilities, whereas people who are knowledgeable and skilled will be less confident: not because they lack knowledge, but because they are aware of how much they do not know and are humble and cautious about their competencies. If you are highly competent in a subject, you are often competent enough to be aware of your blind spots and your gaps within fields of that subject.

This is why experts on a subject display scholarly humility, while someone who knows little sees the subject simplistically, their brain biased into thinking the concept is easy to fathom.


The Halo Effect

The Halo Effect describes our impulse to let our initial impression of a person, thing or event solidify into our overall perception of them. Also known as the “physical attractiveness stereotype” or the “what is beautiful is good” principle, this bias influences us numerous times a day. The Halo effect was originally discovered by Edward L. Thorndike (1875-1949). Thorndike studied ratings of soldiers in the army and found that if one of a soldier’s qualities was rated highly, the other qualities tended to be rated highly as well, and vice versa.

Most common examples are:

  • Thinking that attractive people are smarter, better and nicer than unattractive people.
  • Considering products marketed by celebrities and attractive people as superior and more valuable. 
  • Believing that people who exhibit confidence are therefore competent and intelligent.

What drives this is our need to be right: once we form an opinion about something, we tend to look for clues and confirmation that we are right.

This has a major effect in real life: it is why marketing agencies rely on this bias, and why we still want to make a great first impression in a job interview.

Human resource and hiring managers in particular should be aware of the halo effect, since it is one of the most common causes of hiring mistakes. In relation to the halo effect, it is worth mentioning the “Horn effect”, which is basically its opposite. The Horn effect commonly appears in job interviews: a candidate may make a bad first impression, be less physically appealing, or have bad breath, which will unconsciously influence the hiring manager’s decision.

Take a look at the picture below and consider how the “Halo effect” affects your judgment of the same person on the left and on the right:

Confirmation Bias

Confirmation bias is the propensity to seek out and favor information that confirms our already established beliefs. This happens for a couple of reasons, the main ones being:

  • It feeds our feelings of belonging and self-esteem, because we surround ourselves with like-minded individuals who validate our beliefs.
  • It conserves the mental resources we use for decision making, because we filter out any opposing information.


Examples of The Confirmation Bias are:

  • Selectively only following certain social media groups or influencers that share your views, and deselecting those with opposing views.
  • Refusing to consider all of the facts, when you are seeking information on a given subject. 


People with opposing views on an issue can listen to the same story and come away with completely different conclusions that validate their respective beliefs. This clearly shows that they are being conditioned by confirmation bias. The problem is that this can cause individuals to make poor decisions, refuse to consider opposing views or facts, and contribute to a polarized society.


The Anchoring Bias

This bias reflects our tendency to favor the first piece of information we hear over later ones.

An example might be hearing a 2019 remake of a song first and favoring it over the original version. Or, a doctor’s first impression of a patient’s condition can create an anchor that skews later diagnostic assessments and sometimes causes misdiagnosis. While a fairly common occurrence, not much is known about the causes of this bias. Some research suggests that the source of the anchor information and our mood play a role. Like all other biases, the anchoring bias influences our decision making and can lead to bad choices, or make it hard for people to consider all factors in their decision-making process.

I would argue that the anchoring bias is closely connected to the “halo and horn effects”, for instance in job interviews. First impressions set an anchor, which influences our decisions down the line.

The Curse of Knowledge 

We learn new things all the time, and once we truly start to understand new information, it becomes quite odd to think that there was ever a time when we didn’t know it. That’s where The Curse of Knowledge comes into play: we often develop the misconception that others also know and understand this information, that it is common knowledge, which is quite unfair to our counterparts.


Hindsight Bias

The Hindsight bias is similar to The Curse of Knowledge, but in this case, once we have information about an event or a person, we tend to think that it was predictable and obvious all along, and that we could have foreseen it. Susceptibility to this bias can be quite treacherous, for it often leads people to take inadvisable risks.

On a personal note, I often experience the hindsight bias in relation to my amateur, spare-time stock trading. I continuously catch myself believing I was close to making the right decision to buy or sell a stock, but didn’t, and I curse myself for it.

Negativity Bias

The Negativity Bias is rooted in our severe dislike of losing or being wrong. People love winning and being right, but in terms of our “cognitive budget” we hate failing even more. Whenever we face a decision, we seem to put much more weight on the negatives than on the potential positive outcomes. This leads to missing out on vital experiences and can lead to irrational escapism. In business, negativity bias can mean lost opportunities, which in the end can be extremely expensive.

Another reason for the negativity bias can be found in simple Darwinism, which is all about adaptation and survival. Millions of years ago, and up until practically only a few decades ago, wrong decisions and lurking dangers could kill us. Evolution has hardwired our brains to prioritize threats over opportunities, since focusing on threats is ultimately what kept us alive. “Is that a saber-toothed tiger hiding in the grass?” Well, better to run right away than go find out!


The Self-Serving Bias

The Self-serving Bias has an important role in protecting our self-esteem and confidence.

It is our tendency to give ourselves credit for success, but in the case of failure to put the blame on outside factors, bad luck or other people. An example is a football team that, after winning a match, attributes the good outcome to their hard work and practice, but when losing a match just a week later blames it on bad luck, the weather or the referees. Age, sex and culture are factors, but this bias is widespread across populations. For example, men are more likely to attribute their failures to outside forces, and older people tend to take credit for their own success.

In leadership, a good example is project management. If a project is failing or going off track, a less experienced project manager will typically lack self-awareness and will first try to find the explanation outside their circle of control, while a more experienced one will start looking within their circle of control and influence.


The Availability Heuristic

This represents the tendency to estimate the probability of something happening based on how many examples of it we know and remember.

This can be reflected in the belief that plane crashes happen quite often, because we can think of a number of instances where we saw news coverage of plane crashes.

The availability heuristic is basically our brain taking a shortcut that saves us time in decision making. If we are considering getting vaccinated but have heard of a couple of people having a bad reaction after vaccination, it will lead us to think that the probability of experiencing unwanted side effects is much higher than it really is. The issue with this way of thinking is that it often leads to bad decisions and unrealistic estimates.

A famous study on the availability heuristic asked a group of people which there are more of: English words starting with “K”, or English words with “K” as the third letter. Most answered that they thought there were more words starting with K, while in fact there are about three times as many English words with “K” as the third letter. This happens because our minds can easily come up with words starting with a particular letter, while we are not used to thinking about what the third letter might be.
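The comparison from the study is easy to check mechanically. Below is a minimal sketch of the counting logic; the tiny embedded word list is purely illustrative, and reproducing the real statistic would require a full English dictionary (for example a system word list):

```python
# Illustrative sample only; a real test needs a full dictionary.
SAMPLE_WORDS = [
    "kitchen", "keep", "kind",          # start with "k"
    "acknowledge", "bake", "cake",      # "k" as the third letter
    "joke", "lake", "make", "take",
    "house", "garden",                  # neither
]

def count_k_positions(words):
    """Count words starting with 'k' vs. words with 'k' as the third letter."""
    starts_with_k = sum(1 for w in words if w.lower().startswith("k"))
    third_letter_k = sum(1 for w in words if len(w) >= 3 and w.lower()[2] == "k")
    return starts_with_k, third_letter_k

start, third = count_k_positions(SAMPLE_WORDS)
print(f"start with k: {start}, third letter k: {third}")  # → 3 vs 7
```

Even in this toy list, third-letter-K words outnumber initial-K words, yet the initial-K ones are the ones that spring to mind, which is the whole point of the heuristic.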


The Decline Bias (a.k.a. Declinism)

The decline bias, or Declinism, is the tendency to favor the past over the present: thinking, for example, that certain things, like music, were better and more sophisticated 40 years ago than they are today. Another common example is older folks commenting that back when they were kids, they played outside and were active, while kids today are constantly on their phones and socialize through apps that make no sense to the older generations. In essence, our brains are lazy and we dislike change. We want our lives and our world to make sense, and any change that disrupts that bubble biases us toward the old ways, because when things change, our way of thinking must also change and keep up.

The Fundamental Attribution Error

This one is quite similar to the Self-Serving bias, in that we look for excuses for our own failures in outside factors, but blame other people themselves for theirs. A typical example is blaming another person’s irresponsibility or laziness when they are late, while making contextual excuses when we are late ourselves. It may also stem from The Availability Heuristic, in that we tend to make judgements based on the information we have at hand.

As a leader of employees, you often end up in situations where you should ask yourself whether you might be biased. Previous mistakes made by an employee may have set an anchor that influences your future judgment of potential errors in other areas: you attribute those mistakes to the employee in particular, without reflecting on the broader context and the other factors in play.


The Misinformation Effect

The Misinformation Effect is a predisposition for our memories to be influenced by events that happen after the initial event. The effects of misinformation can be trivial or major and may cause us to completely misremember things. Research indicates that simple things like being asked questions can influence our memory of an event, and that listening to other people’s accounts of the event can skew our memory. What happens is that new information, be it in the form of a question or another person’s account, can blend with our own memory, or sometimes fill in its gaps. This bias is particularly relevant to “witness psychology”, which is in itself a very interesting subject. Science has shown again and again that eyewitness accounts of a potential crime are highly unreliable, yet judges and juries tend to put enormous weight on eyewitness statements.


The Actor Observer Bias

The Actor Observer Bias is quite similar to The Fundamental Attribution Error, in that we tend to attribute our own actions to external factors and influences, while we attribute other people’s actions to internal ones. This probably stems from having insight into our own thoughts, behavior and condition, while we cannot know what other people are thinking, feeling or experiencing; it is entirely a matter of perspective. We therefore focus on circumstances for ourselves, but guess at the internal forces driving other people’s actions.

A good example of the actor observer bias is that we will blame a headache or tiredness for our own poor performance on a test, while at the same time blaming our friend’s laziness or lack of intelligence for the same thing.


The False-Consensus Effect

This effect represents people’s tendency to overestimate how much, or how many, other people agree with their own beliefs, attitudes, values and behaviors. It may well stem from the Availability Heuristic: if some people in our tight circle agree with us on certain topics, surely other people must too. This can lead us to think that others share our opinion on controversial topics, that there are many people similar to us who share our preferences, and so on. The reasons are various. The people we spend the most time with often do share similar beliefs and opinions, which tricks us into believing that this way of thinking holds for the majority of people who are not our family or friends, and can lead us to mistakenly conclude that everyone agrees with us.


The Optimism Bias

This is the tendency to overestimate the probability of good things or outcomes happening to us, while underestimating the probability that negative ones will. In short, we are wired to be too optimistic. This bias also has origins in The Availability Heuristic: if you can think of a number of bad things happening to other people, it seems more likely that negative events will affect other people and not you. While this bias can lead to poor health choices like smoking, not eating well or refusing to wear a seatbelt, it also creates a sense of anticipation for the future and gives people hope and motivation to pursue their ambitions.

Worth mentioning is the necessity of the optimism bias in entrepreneurship. While any entrepreneur must be capable of critical thinking and risk analysis, there is a tendency for some of the most successful among them to lean toward optimism. It makes them jump into projects and take risks that most are not willing to take, which ultimately lets a few of them come out on top as highly successful. Consider startups: more than 9 out of 10 of them fail. In Denmark, for example, most private legal entities cease to exist within 5 years of establishment. Nevertheless, they all have one thing in common: the belief that their startup will flourish.


The Forer Effect (a.k.a. The Barnum Effect)

The Forer Effect describes our tendency to take vague and obscure information and interpret it as something personal and specific to ourselves, despite the fact that the information could apply to virtually anyone. This is best seen when people read a daily horoscope: even though it is vague, they recognize the information as relevant and specific to them.

Action Bias

Action bias is the tendency to think that value is only achieved through taking action. It is the tendency to act rather than hesitate, even when both are reasonable options. In the Stone Age, acting before thinking might have been a reasonable choice and could mean the difference between life and death. But today, taking the time to consider your options is in most cases the better approach.

People feel better when they take action, as opposed to doing nothing, and society in general favors proactive people over those who are not. This is common in the corporate world, where proactivity is a usual personality trait among leaders. The action bias is self-reinforcing, since most leaders are initially promoted as a consequence of their ability to act and make things happen. Action bias is closely connected with proactivity, which is seen as a positive attribute; in most cases, no one is congratulated for doing nothing. When facing challenges, doing nothing is often the bigger taboo, even though the action taken might lead to failure.

A few tips to overcome action bias:

  • Ask yourself if you need to take action right now.
  • List the potential actions you can take.
  • Reflect on the potential actions with your peers.
  • If possible – take the necessary time to reflect on the optimal solution.

Finally, a good visual example of action bias: a goalkeeper facing a penalty kick has a fraction of a second to decide what to do. He can dive to the right, dive to the left, or stay standing in the middle. Statistically the goalkeeper will dive to one side, since standing still in the middle and doing nothing is unthinkable, even though the expected outcome of all three choices could be the same.
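The goalkeeper example can be sketched as a small simulation. The kick distribution and save rule below are invented assumptions for illustration, not real match statistics: kicks are assumed uniform over the three directions, and a save happens only when the keeper guesses the direction.

```python
import random

# Purely illustrative assumptions, not real penalty-kick data.
random.seed(42)

KICK_DIRECTIONS = ["left", "middle", "right"]

def simulate_saves(keeper_action, n_kicks=10_000):
    """Fraction of penalties saved if the keeper always takes the same action."""
    saves = 0
    for _ in range(n_kicks):
        kick = random.choice(KICK_DIRECTIONS)  # assume kicks are uniform
        if kick == keeper_action:
            saves += 1  # save only when the keeper matches the kick direction
    return saves / n_kicks

for action in KICK_DIRECTIONS:
    print(f"always {action}: ~{simulate_saves(action):.0%} saved")
```

Under these assumptions, each strategy saves roughly a third of the kicks, so diving offers no statistical edge over staying put. Keepers dive anyway, because visibly doing nothing feels worse than acting and failing.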