Confirmation Bias (Part One) - A Lot of What You Believe is Wrong


I have often found it to be the case that the humanities have long observed what science is only later able to explain in greater detail – personal beliefs are hard to extinguish, even in the face of evidence that they are untrue. The fifth-century BC Greek historian Thucydides wrote, “it is a habit of mankind…to use sovereign reason to thrust aside what they do not fancy.” Sir Francis Bacon posited that, “The human understanding when it has once adopted an opinion…draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects or despises, or else by some distraction sets aside or rejects.”

But perhaps no one has written on the topic as lucidly as Leo Tolstoy who wrote, “I know that most men – not only those considered clever, but even those who are very clever, and capable of understanding most difficult scientific, mathematical or philosophical problems – can very seldom discern even the simplest and most obvious truth if it be such as to oblige them to admit the falsity of conclusions they have formed, perhaps with much difficulty – conclusions of which they are proud, which they have taught to others, and on which they have built their lives.” Simply put, once we have formed an opinion, especially one central to our worldview, we cling tenaciously to that opinion even when the facts may speak to its wrongheadedness.


The 2004 presidential campaign pitted the incumbent President George W. Bush against the Democratic challenger John Kerry. It also provided an opportunity for brain researchers to study the science of what makes belief so “sticky.” Researchers began by gathering participants who professed to have a well-defined preference for one candidate over the other. Participants were given seemingly contradictory statements from President Bush, Senator Kerry, or a third, politically neutral public figure. They were also given further information that made the seeming contradiction appear more plausible. They were then asked to determine whether or not the individuals in question had in fact made statements that were inconsistent.

During the evaluation and thought process, research participants were monitored inside a magnetic resonance imaging (MRI) machine that allowed the scientists to observe their brain activity. As subjects evaluated seemingly contradictory statements made by their less-preferred candidate, the emotional centers of their brains remained inactive. This allowed them to make cold, rational judgments about these statements. However, as the subjects evaluated the statements of their preferred candidate, the emotional centers of their brains became highly aroused. When the results were tallied, there were clear differences in the evaluations.

Subjects were very likely to endorse their less-preferred candidate as having said something contradictory and were highly unlikely to say the candidate of their choice had made such a rhetorical error. Simply put, when their guy said something incorrect, their emotions drowned it out, but when “the other guy” said something implausible, they rationally pointed out the fallacious thinking. The inference we can draw from this experiment is that the tumult created when facts collided with belief was resolved on the side of emotion. Belief is an emotional construct: we feel it deeply, and we are loath to let silly facts tell us that what we believe is not so. This emotional reaction created a situation whereby the status quo was maintained and previously held beliefs were confirmed, even when logic would have suggested otherwise. In simple terms, we rationally evaluate things that do not intersect with our worldview and emotionally evaluate those that do.


Hopefully by now you’re convinced that questioning some of your deeply held assumptions, while difficult, is a worthwhile undertaking. Rene Descartes, considered by many to be the father of modern philosophy, did something similar when he set out to question everything he thought he knew, famously arriving at the one thing he could know for certain: “Cogito ergo sum (I think, therefore I am).”

Granted, Descartes’s exercise is a little impractical, but it does set a positive precedent in some respects. So much of what we assume we “know” has been passed down to us somewhat uncritically. We’ve accepted the lessons we’ve learned from our parents, teachers and the other salient personal and cultural influences that have been instrumental in shaping our worldview. Would it really be so bad to give some of that a second look – discarding what we no longer find to be true and newly embracing timeless truths?

The Challenge: Ask yourself today - what "untrue-isms" might I be holding on to by evaluating them emotionally rather than logically?

To learn much more about the intersection of mind and markets please check out The Laws of Wealth by Dr. Daniel Crosby - HERE


Asset Managers Need Enemies

Gentle Reader - We may not be formally acquainted, but I think that I know a bit about you. Read my description of you below and tell me how accurately it describes your personality: 

“Although others may see you as put together, inside you are worrisome and insecure. You want to be admired by others and you think about this when making decisions. Although you may not have done big things yet, you feel like that day will come. You feel as though you have a lot of untapped potential. You’re an independent thinker who thoughtfully considers ideas before accepting them. You enjoy a certain amount of variety and change and dislike being restrained by restrictions or limitations. You know you’re not perfect, but you are typically able to use your personality strengths to compensate for your weaknesses.”

So, how did I do? On a scale from 1 to 5, with 5 being the most accurate, how accurately would you say I described your personality? If you’re like most people, you probably ranked that description as a 4 or 5, which likely puzzled you since we’ve never met. The paragraph above illustrates what is called the “Barnum Effect” or, alternately, the “Fortune Cookie Effect.” The Barnum Effect is named for P.T. Barnum, the great entertainer and circus magnate.

Barnum famously posited that “There’s a sucker born every minute” and used his knowledge of how to sucker people to get them to part with their money. Barnum’s understanding of suckers, though born under the big top, undoubtedly surpasses that of many formally trained academicians. P.T. understood what psychologists call “confirmation bias” or the human tendency to look for information that reinforces ideas we already hold.

When we receive feedback about ourselves, there are two simultaneous dynamics that make up the broader phenomenon of confirmation bias. The first of these is “self-verification,” the tendency to reinforce existing beliefs. The second is “self-enhancement,” whereby we attend to information that makes us feel good about ourselves. The function of these two dynamics is clear – to maintain our self-esteem and feelings of confidence. In general this is a positive; after all, who doesn’t want to feel good about themselves? However, these dynamics work in overdrive in a number of instances – including when our deeply held beliefs or our self-esteem are challenged. Confirmation bias becomes problematic when it leads us to maintain the status quo in the face of disconfirmatory information or to overlook realistic, negative feedback about ourselves. In these instances, our need to feel competent can cause us to ignore warnings and make bad financial decisions that privilege ego at the expense of making money.

It is for these very reasons that Daniel Kahneman engages in what he calls "adversarial collaboration" – basically, doing work with people who disagree with him (enemies, if you're feeling dramatic). The Nobel Laureate says:

"I got into adversarial collaboration because there is a system in the scholarly literature where people critique other people's writings, and then there is is a rejoinder. That's the routine in scientific publications. I was just struck by how totally wasteful this is, because in all these exchanges nobody admits to having made an error...It's just foul actually." 

Kahneman understands that his ego – as in-check as it may be – is still keeping him from learning and growing, from testing his assumptions and considering all angles. Asset managers, whose sole job is to make good decisions, would be well served by such an exercise in humility. When running money, keep your friends far from you and your enemies close.

Want more great content on the intersection of mind and markets? Check out The Laws of Wealth by Nocturne Capital founder Dr. Daniel Crosby - HERE