You Aren’t So Smart: Cognitive Biases are Making Sure of It

Cognitive biases are tendencies to think in certain ways that can lead to systematic deviations from a standard of rationality or good judgment. They have all sorts of practical impacts on our lives, whether we want to admit it or not.



According to Wikipedia, cognitive biases "are tendencies to think in certain ways that can lead to systematic deviations from a standard of rationality or good judgment, and are often studied in psychology and behavioral economics." Far more than simply an exercise in academia, cognitive biases have all sorts of practical impacts on our lives, whether or not we admit it.

A very broad umbrella, cognitive bias comes in many forms, as evidenced by the fact that Wikipedia lists over 170 of them. Some of these biases are more prevalent in certain areas of life than in others.

Below is an infographic from Business Insider, of all places, which is an elementary summary of what it refers to as "20 cognitive biases that screw up your decisions."

[Infographic: 20 cognitive biases that screw up your decisions]
Source: Business Insider

But how do these cognitive biases relate to real life? Here are a few examples.

Confirmation bias is a massive hurdle in the context of politically-charged hot-button issues. Have you ever tried to convince a non-believer of the existence of climate change? Similarly, believers in the existence of a higher power tend to view the world through a lens which reaffirms this belief (and vice versa).

Conservatism bias relates to the above, placing a premium on prior information. It helps explain why people who hold steadfast positions on these politically-charged hot-button issues find it so difficult to be swayed from their deeply-held convictions.

Information bias is a problem I have, as I suspect others in the data science realm may. There's a reason we like to collect and analyze data... and that's because we like to collect and analyze data. Choosing a hotel can take much more time than it should for those of us suffering from information bias. "More information is not always better."

Pro-innovation bias is prominent in the very broad "tech" domain, and makes me think of the common gripe that current machine learning and AI tools could be directed at important problems right now, instead of repetitively classifying images of cats and dogs or optimizing cryptocurrency mining power consumption. We need to be realistic about the opportunities that technology provides us to solve real problems, and look earnestly at its limitations.

Recency bias is something we constantly hear about in relation to the current President of the United States. Rumors from the White House often portray President Trump as being most influenced by the last person who had his ear. This may be an extreme example (and may simply be a piece of folklore), but it is an accurate representation of what this bias is, one which has the potential to affect any of us who get caught up in the moment. Just think about it: the idea that you could subconsciously place a premium on some piece of information you just learned, over a vast amount of information you already possess on a topic, isn't just a very real possibility; it's also kind of frightening.

Stereotyping is a tough one, as it is simultaneously a cognitive bias which people insist they don't partake in, and one which people vigorously defend as justified based on their experience (which brings confirmation bias into the picture). The fact that one could cast aspersions on an entire group of unrelated people based on the actions of one with a similar skin color (for example) isn't just obviously wrong; it's so wrong that the mere idea that our minds can work in such a way should scare the hell out of you.

Overconfidence relates to the Dunning-Kruger Effect, which is my all-time favorite cognitive bias (assuming it's not creepy to keep track of such things). Overconfidence stems from being too certain of your abilities, which leads to taking risks on the strength of those abilities (and possibly to a poor outcome). It's not difficult to see how overconfidence in your rope-less cliff-climbing abilities could lead to disaster.

The Dunning-Kruger Effect (not pictured in the above graphic) boils down to the idea that those of inferior cognitive ability overvalue their cognitive ability due to their inferior cognitive ability; there is a flip-side to this as well. Wikipedia does a better job in its description:

[T]he Dunning–Kruger effect is a cognitive bias in which people of low ability have illusory superiority and mistakenly assess their cognitive ability as greater than it is. The cognitive bias of illusory superiority comes from the inability of low-ability people to recognize their lack of ability; without the self-awareness of metacognition, low-ability people cannot objectively evaluate their actual competence or incompetence. On the other hand, people of high ability incorrectly assume that tasks that are easy for them are also easy for other people.

And, of course, blind-spot bias can (wrongly) prevent you from recognizing that any of this is your problem in any way at all.

If this is a topic that interests you, be sure to read my previous offering, based on a fantastic article summarizing and classifying cognitive biases, written by Buster Benson. Buster clearly presents reasoning for why we have developed such obviously-flawed cognitive quirks, and you may find the reasons surprising.

I will leave it to the reader to determine the applicability of cognitive biases to data science, artificial intelligence, and related fields.
