Kschang (his initials and last name) is an IT professional and general manager in San Francisco, CA.
What Is Confirmation Bias?
Confirmation bias is a universal problem that has been recognized as far back as classical Greece.
Recent research suggests that some people are more susceptible to it due to genetic and neurological factors. The problem with this bias is that very few people recognize it in themselves, and when confronted, most claim they do not have it. When taken to extremes, it may account for radicalization, various phobias, and many other problems.
In this article, we will discuss the cause, the effects, and some coping mechanisms of confirmation bias.
Confirmation Bias Exists Everywhere
Confirmation bias is the mind taking a shortcut: choosing to seek out and accept information that fits your existing mental model or viewpoint, instead of considering all the information available. In other words, your mind is biased toward seeking information that confirms your viewpoint.
Below are some examples.
Political pundits base their existence on confirmation bias. Commentators like Arianna Huffington on the left and Ann Coulter on the right filter the world through their own views, and people read them for confirmation of their own beliefs.
The Colbert Report was a satirical show on Comedy Central in which Stephen Colbert satirized conservative politics. Republican Mike Huckabee once thanked Colbert for his "support" when he was clearly being made fun of. That is confirmation bias at work: liberals saw The Colbert Report as satire, while some conservatives saw it as genuine praise. Each side saw only what it wanted to see.
Confirmation bias in finance leads to many problems, including the bandwagon effect, where people buy the "hot" stock, making it rise even further, as a confirmation of their belief that the stock will rise. This also leads to only listening to analysts who agree with your views and ignoring the rest who may be giving "better" advice.
When those stocks fall, the bias turns into the "sunk cost fallacy": the owner refuses to sell, hoping the stocks will recover, ignoring the falling prices and remembering only the past peaks and gains.
Hypochondria, paranoia, and various other phobias are basically negative confirmation bias: the patient keeps noticing negative clues and experiences that reinforce his or her negative expectations.
"Psychics" and mediums who claim to communicate with the dead at a "cold reading" typically make a large number of ambiguous statements while reading clues from the crowd or the subject to decide how to continue. Those who believe in the paranormal and the afterlife will recall a much higher "accuracy" for those statements than there actually was.
In jury trials, researchers have found that jurors often make up their minds very early in the trial and afterward look only for evidence that confirms their expectations. Therefore, every effort is made to present the defendant as innocent-looking as possible so as not to bias the jury. For example, defendants do not appear in court in prison garb at jury trials, because doing so predisposes the jury to the idea that the defendant is guilty.
Scientific procedures are often designed specifically to combat confirmation bias, by performing double-blind trials and randomization, plus peer review (though peer review itself may be subject to the confirmation bias of the reviewers).
Affirmation is essentially positive confirmation bias applied to oneself: by presenting positive feedback to oneself, one hopes to reinforce a positive self-image.
Three Types of Confirmation Bias
Experts identified three types of confirmation bias: biased search, biased interpretation, and biased memory.
When searching for data, items that match expectations are included, while items that do not are skipped or assigned lesser importance. This is often known as "cherry-picking."
Information is reinterpreted to match expectations. Even when two people observe the same events, their interpretations can be completely opposite due to biased interpretation: each sees only what they want to see.
Even when the interpretation is neutral, a person may remember the event selectively to fit their expectations, recalling items that fit and forgetting items that do not. This is sometimes called "selective recall" or "access-biased memory." It is a less conscious version of biased search.
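As a toy illustration (my own sketch, not a model from the research literature), the three types can be pictured as successive filters on incoming evidence, each discarding or distorting whatever contradicts the held belief:

```python
# Toy sketch of the three confirmation-bias filters (illustrative only,
# not a cognitive model). Each piece of evidence is labeled "pro", "con",
# or "ambiguous" relative to a held belief.

def biased_search(evidence):
    """Cherry-picking: disconfirming items are never even collected."""
    return [e for e in evidence if e != "con"]

def biased_interpretation(evidence):
    """Ambiguous items are read as supporting the belief."""
    return ["pro" if e == "ambiguous" else e for e in evidence]

def biased_memory(evidence):
    """Selective recall: only belief-consistent items are remembered."""
    return [e for e in evidence if e == "pro"]

observed = ["pro", "con", "ambiguous", "con", "pro", "ambiguous"]
recalled = biased_memory(biased_interpretation(biased_search(observed)))
print(recalled)  # every surviving item supports the belief
```

Run in sequence, the three filters leave behind only belief-confirming evidence, even though the original observations were mixed.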
So where do these biases come from? They're in your head.
Various studies have shown that different areas of the brain receive and process information, and that another area then gathers the various inputs and decides which path to follow.
A study at Brown University (published in the Journal of Neuroscience) on 70 volunteers found that some people gave more weight to information that confirms their existing experiences while giving less weight to information that contradicts their existing experiences. In other words, they are predisposed by their genetics to have confirmation bias.
To be specific, two areas of the brain are affected: the prefrontal cortex, and striatum.
The prefrontal cortex is used to process and file second-hand information received, such as advice ("wear sunscreen"), while the striatum is used to process first-hand experiences ("I got sunburned, I should wear sunscreen"). When the brain decides to discount first-hand experience (striatum) in favor of second-hand advice (prefrontal cortex), that brain is said to have confirmation bias.
In the study, the volunteers were tested on something they knew nothing about but were given "hints" that were not always right. Later, they learned that some of the hints they had been given were wrong. Those who continued to give the wrong answer (i.e., relying on the hint even though it was known to be wrong) exhibited confirmation bias.
The COMT gene variation affects how dopamine acts on the prefrontal cortex; participants with one variant were able to disregard the bad advice more often than those with the other. Let's arbitrarily call the "ignores bad advice more" variant C-A and the "ignores bad advice less" variant C-B.
The DARPP-32 gene variation affects how dopamine acts on the striatum; participants with one variant learned from new experience faster but also stayed stuck on wrong advice for longer. Let's call the "stays stuck more" variant D-A and the "stays stuck less" variant D-B.
Thus, if you have the combination C-B and D-A, you are more prone to confirmation bias than people with the other three combinations.
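The four combinations can be enumerated in a short sketch (remember that C-A, C-B, D-A, and D-B are this article's arbitrary labels, not standard genetics nomenclature):

```python
# Enumerate the article's four (arbitrarily labeled) gene combinations.
# C-A = COMT variant that ignores bad advice more; C-B = ignores it less.
# D-A = DARPP-32 variant that stays stuck on wrong advice more; D-B = less.
from itertools import product

def susceptibility(comt, darpp32):
    """Per the article: most prone to confirmation bias is the person who
    discards bad advice less (C-B) AND stays stuck on wrong advice more (D-A)."""
    return "more prone" if (comt, darpp32) == ("C-B", "D-A") else "less prone"

for comt, darpp in product(["C-A", "C-B"], ["D-A", "D-B"]):
    print(comt, darpp, "->", susceptibility(comt, darpp))
```

Only one of the four combinations (C-B with D-A) lands in the "more prone" bucket; the other three do not stack both disadvantages.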
When confirmation bias goes to the extreme, it causes many additional problems.
Polarization of Opinion
As each person's expectations are reinforced by additional confirmation, that person's attitude toward those expectations becomes more extreme and polarized. This often leads to serious conflict on controversial issues such as abortion, gun control, and the death penalty. This is known as radicalization.
The more radicalized people become, the less they talk with "normal" people (they feel alienated and out of place) and the more they talk with like-minded radicals, leading to further radicalization. It is a positive feedback loop. Terrorists often emerge from such radicalization and alienation.
Persistence of Discredited Belief
Even when a belief has been proven false, it is not completely erased from the mind; it remains in the background in a weakened state, still influencing the decision process. In other words, one cannot completely rid oneself of a discredited belief except through time and conscious effort.
Furthermore, the mind can sometimes rationalize its way into loopholes, justifying something that is completely unjustifiable when viewed from the outside. Participants in pyramid schemes and cults often refuse to acknowledge that they are in one, even when confronted with the truth, such as the arrest of their "leader."
Preference for Early Information
Another form of confirmation bias gives more weight to earlier information in the decision process, even when the order should not matter. The earlier information leads the brain to assign less weight to subsequent conflicting information. It is a form of biased interpretation.
Scams often work by presenting only the good information at first, hiding the neutral or bad information. Even when you later get the whole picture, the early "good" information is stuck in your mind and somehow outweighs the later "bad" information, even though the two should have cancelled out.
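A minimal sketch of this primacy effect, assuming (purely for illustration, not as an established model) that each later piece of evidence is discounted by a fixed factor:

```python
# Toy model of order-weighted evidence: each successive item is discounted
# by a fixed factor, so early information dominates even when later
# information should cancel it out. The discount factor is an assumption
# chosen for illustration, not a measured psychological constant.

def weighted_impression(evidence, discount=0.5):
    """evidence: list of +1 (favorable) / -1 (unfavorable) items,
    in the order they were received."""
    return sum(e * discount**i for i, e in enumerate(evidence))

# A scam presents three good facts first; three bad facts surface later.
print(weighted_impression([+1, +1, +1, -1, -1, -1]))  # 1.53125: still favorable
# An unbiased weighting would have them cancel exactly:
print(weighted_impression([+1, +1, +1, -1, -1, -1], discount=1.0))  # 0.0
```

With any discount below 1, the three early "good" items outweigh the three later "bad" ones, mirroring how the scam's opening pitch keeps outweighing the facts that surface afterward.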
Illusory Association Between Events
Illusory correlation is the tendency to see correlations in a set of data where none exist: to see a pattern where there is no pattern. This is another variation of biased interpretation, in which non-corresponding or neutral data are discounted or ignored in order to form a correlation.
In extreme form, this leads to pronoia or paranoia: seeing positive or negative things that are not there.
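To see how biased search alone can manufacture a correlation out of nothing, here is a small sketch: generate two independent (and therefore uncorrelated) data sets, then "cherry-pick" only the points that happen to fit a pattern:

```python
# Sketch of an illusory correlation manufactured by cherry-picking.
# xs and ys are generated independently, so any real correlation is ~0.
import random

random.seed(0)
xs = [random.random() for _ in range(1000)]
ys = [random.random() for _ in range(1000)]  # independent of xs

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

# Biased search: keep only points where x and y "agree" (both high or both low).
picked = [(x, y) for x, y in zip(xs, ys) if (x > 0.5) == (y > 0.5)]
px, py = zip(*picked)

print(pearson(xs, ys))  # near 0: no real correlation
print(pearson(px, py))  # strongly positive: an illusory correlation
```

The full data show essentially no correlation, but the cherry-picked subset shows a strong one, even though nothing about the underlying data changed.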
How to Spot Confirmation Bias in Yourself (and Others)
Confirmation bias is not always bad; it can be a useful tool on rare occasions, such as in self-image therapy. In most cases, however, it is a bad thing to have, especially in its extreme forms.
The easy access to information the internet provides (such as how you found your way to this hub) also means that even the craziest theories can often find some sort of supporting "evidence," such as the various satirical articles from The Onion (many of which have been quoted as if they were real), fueling the proliferation of conspiracy theories.
Thus, to avoid confirmation bias, you must intentionally seek information supporting both sides. If you are researching "Is theory A true?", you also need to research "Is theory A false?" Then make a conscious decision to study both sets of data and to interpret them as neutrally as possible.
Furthermore, you must seek to verify and validate every bit of information you find. As Hemingway said long ago, you must have your own crap detector.
All good convincing arguments should explain issues on both sides. If the argument is completely one-sided, the argument is very likely showing confirmation bias.
- Dead But Not Buried: The Long-Term Consequences of Obama Handling the Birther Issue | HuffPost
President Obama’s disclosure of his long-form birth certificate should have been the end of a ridiculous controversy. It wasn't.
- Genes May Influence How Often People Follow Bad Advice | Discover Magazine
Researchers have found that genes may influence whether people stick with advice they were given, even when their own experience contradicts it.
- Confirmation Bias - Definition & Examples | The Decision Lab
Confirmation bias describes our underlying tendency to notice, focus on, and give greater credence to evidence that fits our existing beliefs.
- Confirmation bias in the utilization of others’ opinion strength | Nature Neuroscience
Humans tend to discount information that undermines past choices and judgments, having a significant impact on domains ranging from politics to science and education.
- Refuse to learn from experience? Thank your genes - Scientific American Blog Network
New findings suggest that a person's willingness to coolly consider the facts gleaned from their own experience—apart from others' previous verbal suggestions—might be based in large part on genetics.
This content is accurate and true to the best of the author’s knowledge and is not meant to substitute for formal and individualized advice from a qualified professional.
kschang (author) from San Francisco, CA, USA on June 16, 2012:
Haven't read that particular one, but his presentations on TED were very illuminating, and partly inspired this hub.
Mohan Kumar from UK on June 16, 2012:
This is a very erudite and effective summation of the aspects of confirmation bias. I like the way you've categorised the various aspects, given an inkling of the science and psychology of it, and quoted very illustrative examples of confirmation bias. I am researching bias and decision making in medical narratives (both doctors and patients) and, as a practising clinician, am astounded by the number of events where this is at play (as I suppose it is in real life). Have you read Daniel Kahneman's seminal work on this, 'Thinking, Fast and Slow'? If you haven't, I highly recommend this tome. Very well put together, Kschang, and thanks for sharing!
kschang (author) from San Francisco, CA, USA on February 28, 2012:
@Becca -- so the links I provided as examples are not sufficient as "references"? I guess I don't know what you're looking for. Are you looking for Wikipedia style citings?
Becca on February 28, 2012:
This would be a very useful source of information if you had provided any references!