Confirmation Bias (Part One) - A Lot of What You Believe is Wrong


I have often found that the humanities observe what science is only later able to explain in greater detail: personal beliefs are hard to extinguish, even in the face of evidence that they are untrue. The fifth-century BC Greek historian Thucydides wrote, “it is a habit of mankind…to use sovereign reason to thrust aside what they do not fancy.” Sir Francis Bacon posited that, “The human understanding when it has once adopted an opinion…draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects or despises, or else by some distraction sets aside or rejects.”

But perhaps no one has written on the topic as lucidly as Leo Tolstoy who wrote, “I know that most men – not only those considered clever, but even those who are very clever, and capable of understanding most difficult scientific, mathematical or philosophical problems – can very seldom discern even the simplest and most obvious truth if it be such as to oblige them to admit the falsity of conclusions they have formed, perhaps with much difficulty – conclusions of which they are proud, which they have taught to others, and on which they have built their lives.” Simply put, once we have formed an opinion, especially one central to our worldview, we cling tenaciously to that opinion even when the facts may speak to its wrongheadedness.


The 2004 presidential campaign pitted the incumbent, President George W. Bush, against the Democratic challenger, Senator John Kerry. It also gave brain researchers an opportunity to study what makes belief so “sticky.” Researchers began by gathering participants who professed to have a well-defined preference for one candidate over the other. Participants were shown seemingly contradictory statements from President Bush, Senator Kerry, or a third, politically neutral public figure, along with further information that made the apparent contradiction seem more plausible. They were then asked to determine whether the individuals in question had in fact made inconsistent statements.

During the evaluation process, research participants were monitored inside a magnetic resonance imaging (MRI) machine that allowed the scientists to observe their brain activity. As subjects evaluated seemingly contradictory statements made by their less-preferred candidate, the emotional centers of their brains remained inactive, allowing them to make cold, rational judgments about those statements. As they evaluated the statements of their preferred candidate, however, the emotional centers of their brains became highly aroused. When the results were tallied, there were clear differences between the evaluations.

Subjects were very likely to judge their less-preferred candidate as having said something contradictory and highly unlikely to say that their chosen candidate had made such a rhetorical error. When their candidate said something inconsistent, their emotions drowned it out, but when “the other guy” said something implausible, they rationally pointed out the fallacious thinking. The inference we can draw from this experiment is that when facts collided with belief, the emotional tumult was resolved on the side of emotion. Belief is an emotional construct; we feel it deeply and are loath to let mere facts tell us that what we believe is not so. This emotional reaction preserved the status quo and confirmed previously held beliefs, even when logic would have suggested otherwise. Simply put, we rationally evaluate claims that do not intersect with our worldview and emotionally evaluate those that do.


Hopefully by now you’re convinced that questioning some of your deeply held assumptions, while difficult, is a worthwhile undertaking. René Descartes, considered by many to be the father of modern philosophy, did something similar when he set out to question everything he thought he knew, famously arriving at the one thing he could know for certain: “Cogito ergo sum” (I think, therefore I am).

Granted, Descartes’ exercise is a little impractical, but it sets a positive precedent in some respects. So much of what we assume we “know” has been passed down to us somewhat uncritically. We’ve accepted the lessons we’ve learned from our parents, teachers, and the other salient personal and cultural influences that have been instrumental in shaping our worldview. Would it really be so bad to give some of that a second look, discarding what we no longer find to be true and newly embracing timeless truths?

The Challenge: Ask yourself today: what “untrue-isms” might I be holding on to because I evaluate them emotionally rather than logically?

To learn more about the intersection of mind and markets, check out The Laws of Wealth by Dr. Daniel Crosby.