Friday, November 15, 2024

How Our Brains Convince Us That We Are Always Right

Dr Saheb Sahu

Cognitive Bias vs Heuristic

A cognitive bias is a systematic pattern of deviation from norms or rationality in judgment. A heuristic is a mental method of solving a problem through intuition, trial and error, or informal methods when there is no formal means or formula for solving it (and often even when there is). These heuristics are sometimes called rules of thumb, and the systematic errors they produce are known as cognitive biases. Common causes of our biases are our personal experience and upbringing; the experiences of others, such as our parents, teachers, and friends; the culture we live in; and the information we process (news, social media). No matter what belief system is in place (religious, political, economic, or social), these cognitive biases shape how we interpret information to fit the way we want the world to be, not necessarily how it really is.

Some of these biases are confirmation bias, hindsight bias, self-justification bias, sunk cost bias, anchoring bias, the endowment effect (status quo bias), and many others.

The Confirmation Bias

It is the mother of all cognitive biases. Confirmation bias is the tendency to interpret new evidence as confirmation of one's existing beliefs or theories; it has also been termed "myside bias". One example is how we seek out or interpret news stories: we are more likely to believe a story if it confirms our pre-existing views, even if the evidence presented is shaky or inconclusive. Confirmation bias is particularly potent in political and religious beliefs.

Hindsight Bias (the "knew-it-all-along" phenomenon)

The hindsight bias is the tendency to reconstruct the past to fit our present knowledge. Once an event has occurred, we look back and reconstruct how it happened, why it had to happen that way, and why we should have seen it coming all along. The hindsight bias is usually on prominent display after a major disaster, such as NASA's Challenger explosion, a major flood, or a cyclone. It can also be seen in the writings of historians describing the outcomes of battles.

Self-Justification Bias

The self-justification bias is the tendency to rationalize decisions after the fact, to convince ourselves that what we did was the best thing we could have done. This bias applies to everything from career and job choices to mundane purchases. Many smart people believe weird things because they are better than others at rationalizing their beliefs. Politics is filled with self-justifying rationalizations.

Attribution Bias

Attribution bias is our tendency to attribute different causes to our own beliefs and actions than to those of others. People constantly make attribution judgments and assumptions about why others behave in certain ways. For example, you are driving along the road and another car cuts in front of you erratically. Based on that one observation, you may conclude that the driver is drunk, rude, or aggressive. None of that may be true.

Sunk Cost Bias

The sunk cost bias (or fallacy) describes our tendency to follow through on an endeavor if we have already invested time, effort, or money in it, whether or not the current costs outweigh the benefits. For example, we hang on to losing stocks, unprofitable investments, failing businesses, and unsuccessful relationships. This bias rests on a basic fallacy: that past investment should influence future decisions.

The Endowment Effect (Status Quo Bias)

Psychologists call it "the status quo bias" and economists call it "the endowment effect": the tendency to value what we own more than what we do not. Beliefs are like private property, and therefore the endowment effect applies to our belief systems as well. The longer we hold a belief, the more we endow it with value and the less likely we are to give it up. This is one of the reasons there are zealots in every religion. It is also why it is hard for us to sell a stock that has gone down in price.

Other Biases and Beliefs

Our beliefs are buffeted by the biases above, as well as by the additional biases listed below:

Authority bias: the tendency to overvalue the opinion of an authority figure, such as a religious leader, a guru, a baba, or a politician.

Halo effect: the tendency to generalize one positive trait of a person to all of that person's other traits.

In-group bias: the tendency to favor the beliefs and attitudes of those we perceive to be fellow members of our own groups.

Stereotyping or generalization bias: the tendency to assume that a member of a group has the characteristics believed to represent the group, without any actual information about that particular member.

There are many other biases, such as the bandwagon effect, consistency bias, expectation bias, negativity bias, the self-fulfilling prophecy, and more.

Thinking, Fast and Slow

Daniel Kahneman, an Israeli-American psychologist who won the Nobel Prize in economics in 2002, has written a wise book (Thinking, Fast and Slow, 2013) for everyone who makes personal and business decisions. Our emotional and intuitive responses (what Kahneman calls "fast thinking") often lead us to make serious mistakes, especially in the field of investing. People tend to be overconfident and overly optimistic in their predictions. They systematically make biased judgments and too often follow the herd. They suffer from pride and regret and respond asymmetrically to gains and losses. Slow thinking, by contrast, is deliberative, logical, and analytical. It avoids snap judgments and emotional instincts and allocates mental effort in proportion to the complexity of a decision. When decisions must be based on forecasts, it recognizes uncertainty and assigns a range of probabilities to future events. The book is a masterpiece of insight into the human mind.

Conclusion

The study of cognitive biases has revealed that humans are anything but rational. We all have biases, and they influence our day-to-day judgments. A judge or jury assessing the evidence against a defendant, a CEO evaluating information about a company, a doctor making a medical diagnosis, a scientist weighing data in favor of a theory, or an investor buying or selling a stock will all face the same cognitive temptation to confirm what they already believe. Having a bias does not make us a bad person; it is failing to recognize our biases that can lead to bad decisions at work, in life, and in relationships. If we are aware of our biases and take the time to make more deliberative (slow-thinking) decisions, we are more likely to make better ones.

Sources

1. Michael Shermer, The Believing Brain. New York: Times Books, 2011.

2. Daniel Kahneman, Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2013.

3. Wikipedia.org/biases
