Simply put, a cognitive bias is a systematic error in thinking that conflicts with logic and rationality. As much as we like to call ourselves rational, the truth is that the human psyche is loaded with cognitive biases.
Being rational, therefore, is a continuous process of being aware of these biases and not letting them color our perceptions, decisions, and judgments.
1) Choice-supportive bias
Your father prepares dinner, saying that he tried a brand new recipe. He assures you that you’ve never eaten anything like it before. When you take your first bite, you realize it really is nothing like you’ve eaten before, but not in a good way. Everyone but your father feels the same.
“Come on! It’s delicious! What’s wrong with your taste buds?” he says, emptying his own plate in seconds to prove his point.
Choice-supportive bias is defending and supporting your own choices, opinions, and decisions even if they have palpable flaws. Like many other biases, it’s an ego thing. We identify with our decisions, perceiving an opposition to them as an opposition to us.
2) Pro-innovation bias
Innovation is, by all means, great, until ego gets involved, which happens quite often. This cognitive bias states that an innovator tends to overvalue the usefulness of his innovation and undervalue its limitations. Why shouldn’t he? After all, it’s his innovation.
3) Confirmation bias
We tend to expose ourselves only to information that confirms our existing beliefs. This cognitive bias is perhaps the most widespread and pervasive of all. Information that shakes a person’s belief system induces cognitive dissonance, an uncomfortable psychological state. Therefore, it is often met with vehement opposition.
4) Conservatism bias
Like confirmation bias, it has to do with the maintenance of beliefs. It means favoring prior information over new information, because the prior information supports our beliefs while the new information threatens to shatter them.
5) Bandwagon effect
You’re likely to hold a belief if it’s also held by the majority. You’re like, “If so many people believe it, how can it not be true?”
But as philosopher Bertrand Russell said, “Even if a million people say a foolish thing, it’s still a foolish thing.” Mark Twain made the point more amusingly, “Whenever you find yourself on the side of the masses, it’s time to pause and reflect.”
6) Ostrich effect
Ignoring negative information by burying your head in the sand like an ostrich. It’s a pain-avoidance mechanism. So-called ‘positive thinkers’ are usually prone to this bias. When something’s wrong, it’s wrong. Hiding from it doesn’t make it right, nor does it mean it’s no longer there.
7) Anchoring bias
Let’s say you’re negotiating a car deal and the car is priced at, say, 1,000 currency units. The dealer expects you to negotiate somewhere below 1,000 units, so 1,000 units becomes the anchor around which you’ll make your offers.

You may get the deal if you pay 900 units, because that’s close to the anchor. But if you insist on buying the car for 700 units, success is unlikely because that’s too far from the anchor.
In this sense, an anchor is a reference point around which we make our subsequent decisions. In any negotiation, the person who sets the anchor first has the advantage of steering the deal in his favour, because doing so exploits our anchoring bias.
8) Selective perception
Our expectations, beliefs, and fears sometimes distort the reality that we see.
Let’s say you’re unsure about your self-image because you’re wearing baggy pants that you hate. When you walk past a bunch of laughing people on the street, you might mistakenly perceive that they’re laughing at you because you’re wearing odd-looking pants.
In truth, their laughter may have nothing to do with you.
9) Overconfidence

Overestimating your knowledge and abilities. Experts are especially prone to this bias because they think they ‘know it all’. Overconfidence is often the result of having many successful experiences behind you, to the point that you become blind to new possibilities or outcomes.
10) Stereotyping

Expecting a person to have the traits of a group to which he belongs. It enables us to quickly tell friend from foe when we encounter strangers. Sure, stereotypes exist for a reason, but it doesn’t hurt to get to know a person before making an assessment of their traits.
11) Outcome bias
Judging a decision by its accidental positive outcome rather than by the careless way in which the decision was actually made.
Say you take a huge gamble where you have a 50-50 chance of winning or losing. If you win, it’s going to be a big win, and if you lose, it’s going to be a huge loss.
If you do actually win, you tend to believe post hoc that the decision was indeed the right one. In truth, it was just a toss-up. Had you lost your money, you’d be cursing your ‘brilliant’ decision.
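The toss-up can be made concrete with a little arithmetic. The sketch below uses hypothetical stakes (1,000 units either way) to show that the expected value of the bet is fixed before the outcome is known; the win or loss you actually observe is just one random draw and says nothing about the quality of the decision:

```python
import random

random.seed(1)

# Hypothetical stakes for illustration (not from the article's text):
WIN_AMOUNT = 1000
LOSS_AMOUNT = -1000
P_WIN = 0.5

# The quality of the decision is fixed *before* the coin lands:
expected_value = P_WIN * WIN_AMOUNT + (1 - P_WIN) * LOSS_AMOUNT
print(expected_value)  # 0.0: a pure toss-up either way

# The realized outcome is just one random draw from that same gamble;
# it tells you nothing about whether taking the bet was wise.
outcome = WIN_AMOUNT if random.random() < P_WIN else LOSS_AMOUNT
print(outcome)
```

Outcome bias is judging the decision by the second number when only the first one was knowable at decision time.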
12) Gambler’s fallacy
Another gambling bias, though a more insidious one. Here’s what you say when you’re in the grip of this bias:
“I didn’t win in all my previous attempts, which means I’ll surely win in the next one because that’s how the laws of probability work.”
Wrong! If your chance of winning a game is 1/7, then it’s 1/7 on the first attempt, 1/7 on the 7th attempt, and 1/7 on the 100th attempt, any attempt for that matter. Each attempt is independent; probability isn’t going to cut you some slack just because you’ve already tried 99 times.
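If you don’t trust the maths, you can simulate it. This minimal sketch (the 1/7 game is from the text above; everything else is illustrative) compares the overall win rate with the win rate on attempts that come right after six straight losses, the very attempts the gambler’s fallacy claims are “due”:

```python
import random

random.seed(0)
WIN_P = 1 / 7  # chance of winning a single round, as in the article

# Simulate a million independent rounds of the game.
results = [random.random() < WIN_P for _ in range(1_000_000)]

overall_rate = sum(results) / len(results)

# Win rate on attempts that immediately follow six consecutive losses.
after_streak = [results[i] for i in range(6, len(results))
                if not any(results[i - 6:i])]
streak_rate = sum(after_streak) / len(after_streak)

print(f"overall win rate:        {overall_rate:.4f}")
print(f"win rate after 6 losses: {streak_rate:.4f}")
# Both hover around 1/7 ≈ 0.1429: past losses don't make a win more likely.
```

A losing streak changes nothing, because each round is drawn independently of the ones before it.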
13) Blind-spot bias
The tendency to spot biases much more in others than you do in yourself. If, while going through this article, you could only think of others who have such biases and not yourself, then you may have fallen prey to this type of bias.
The fact that I’m noticing in you a bias of noticing others’ biases makes me think I may have fallen prey to this bias too.
14) False cause
We live in a cause-and-effect universe where the cause often immediately precedes the effect. We also live in a universe where a lot of things are happening at the same time.
Other than the real cause, many related and unrelated events also precede the effect that we observe. So, we’re likely to mistake one of these events as the cause of our observed effect.
Just because two events occur in succession doesn’t mean the preceding event is a cause of the succeeding event. False cause bias is the basis of most superstitions.
Say you slip on the street and fall face-first into the ground right after a black cat crosses your path. This doesn’t necessarily mean that the cat, notorious for bringing bad luck, was responsible for your fall (although it could’ve distracted you).
It could very well be that you slipped on a banana peel, or that you were so lost in your thoughts you didn’t notice a pothole in the road.
Similarly, when you install a new software program and your computer crashes, it’s tempting to think the software caused the crash. But the real reason behind the crash may have nothing to do with the software.
15) Strawman

People rarely engage in arguments or discussions to better their understanding or to increase their knowledge. Mostly, they enter a discourse to win, to one-up their opponent.
One common tactic that debaters use is to misrepresent their opponent’s argument and attack that misrepresentation to better their own position. After all, by exaggerating, misrepresenting or even completely fabricating someone’s argument, it’s much easier to present your own position as being reasonable.
Say you’re discussing nationalism with a friend and express your disapproval at the concept, saying that we should all think of ourselves as global citizens. Agitated, your friend says, “So you’re saying we shouldn’t care about our country and its progress. You’re a traitor!”
16) Slippery slope
Cool alliteration, isn’t it? A person committing the slippery slope bias thinks along these lines…
If we allow A to happen, then Z will also happen, therefore A shouldn’t happen.
Unsurprisingly, the attention is directed away from the issue at hand and people begin to worry about baseless extreme hypotheticals and suppositions.
The best example is of those who oppose gay marriage. “What! We can’t allow gay couples to marry. Next thing you know people will marry their parents, their house and their dog.”
17) Black or white
Seeing only two extreme and opposite possibilities because that’s what you’re shown, while ignoring all the equally plausible options that lie in the grey area.

Also known as the false dilemma, this tactic is a favourite of demagogues because it has the false appearance of being logical. It pushes people to choose the better of the two alternatives they’re presented with, leaving them unaware that many other alternatives may exist.
18) Appeal to nature
Also called the naturalistic fallacy, it’s the argument that because something is ‘natural’ it is, therefore, valid, justified, good, or ideal. Sure, many things that are natural are good such as love, happiness, joy, trees, flowers, flowing rivers, mountains, etc.
But hatred, jealousy, and depression are also natural. Murder and theft are also natural.
Poisonous plants and wild animals that attack unwitting picnickers are also natural. Diseases and cancers are also natural. Volcanoes, earthquakes, and hurricanes are also natural.
19) Special pleading
Inventing new ways to hold on to old beliefs, especially when those beliefs have been proven wrong. When the reasons with which we support our beliefs are crushed, we craft new ones.

After all, it’s much easier to defend an existing belief than to abandon it and endure the mental instability that follows.
Raj was adamant in his belief that the earth was flat. “No matter how far I run in a particular direction, I can never fall off an edge or something,” Vicky reasoned, hoping to change his friend’s mind. “Well, you must be running in the wrong direction then,” Raj replied.
20) Bias Bias
Also known as the fallacy fallacy, it means dismissing a person’s argument solely because he commits one or more cognitive biases while making it. Some people simply don’t know how to present their arguments and inadvertently slip into a bias. That doesn’t necessarily mean their point carries no merit.
Sometimes it also takes the form of accusing someone of committing a bias, even when he isn’t, in order to avoid answering his question or to deviate from the topic at hand.
Hi, I’m Hanan Parvez (MBA, MA Psychology), founder and author of PsychMechanics. I’ve published one book and authored 400+ articles on this blog (started in 2014) that have garnered over 4.5 million views. PsychMechanics has been featured in Forbes, Business Insider, Reader’s Digest, and Entrepreneur. Feel free to contact me if you have a query.