Critical Thinking and Learning Site

Confirmation Bias


Confirmation Bias in everyday life.

Have you ever been in a discussion wondering, “How can these people believe that? Haven’t they read the news?” or “How can you say that? Just google it and you will see that things are not like that.” Chances are you have been in such a situation. The discussion could be about anything, from politics and religion to who is the best football player or team.

On every occasion, we are sure about our opinions and beliefs, and we consider ourselves well informed. Moreover, we cling to our ideas and often defend them like there is no tomorrow. We rarely consider changing our minds, especially about things we support strongly and that mean a lot to us (e.g. you will rarely see someone change their opinion about abortion or the death penalty). But how sure are you that you are right? How sure are you that your opinion is valid? How sure are you that you are indeed well informed about the subject? Of course, you might be, but here comes the confirmation bias.

What we believe

That our opinions are sound, that we have the information and the arguments to support them, and that they are entirely logical. If people oppose our views, they are either idiots or they have not done sufficient research on the subject.

What is the reality

Our opinions are biased by the particular information we consume. We seek out the information that confirms our beliefs and tend to ignore everything that refutes them.

Confirmation bias refers exactly to this kind of misconception.

This is huge! By pinpointing this bias, you may radically change the way you interpret things, thus the way you evaluate situations, and finally, the way you think.

Confirmation bias is the inclination to seek out, rely on, draw conclusions from, or favor information that confirms our already formed beliefs or opinions, while paying less attention to, or sometimes completely ignoring, all the evidence and information supporting the opposite.

That means that when people examine their beliefs, they look only for the evidence supporting those beliefs; they do not seek the objective reality or truth. If a person receives 100 articles to read, and 95 of them come from high-quality sources rejecting his or her opinion while the other 5 come from a non-reputable source but confirm that opinion, he or she will stick to those 5 articles and ignore the other 95.

We form an opinion, then focus our efforts on trying to substantiate that view, without bothering to evaluate, or even deliberately disregarding, all sources and evidence that diverge from our conclusion.

Where did all this begin, and an interesting experiment*

*If you consider the article long, you can skip to the last paragraph for simple advice on how to avoid confirmation bias in everyday life. But I would suggest reading the whole article; it shouldn’t take more than 10 minutes.

Back in the 1960s, some experiments indicated that people are biased when confirming their opinions. Perhaps the most characteristic was Peter Wason’s experiment, designed to examine how people test a hypothesis. Wason, an English psychologist, provided a series of three numbers, e.g. 2-4-6, and asked people to figure out the rule behind the sequence by proposing sequences of their own. Most answers contained three even numbers in ascending order (such as 8-10-12). More imaginative minds came up with 1-3-5, or -6, -4, -2.

All those answers were correct, but none of them revealed Wason’s rule, which merely required three ascending numbers. The issue with this experiment is that people didn’t try to challenge their initially formed hypothesis by testing alternatives. They simply assumed the sequence demanded an increase by two at a time and proposed numbers that corroborated their theory. People do not try to refute their opinion, but rather to confirm it. They do not think critically and rationally.
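The logic of the task can be sketched in a few lines of Python (the function names and test sequences are my own illustration, not part of Wason’s materials). The point is that sequences chosen only to fit the narrower “+2 each time” hypothesis can never falsify it, because they all satisfy the true rule as well:

```python
def hidden_rule(seq):
    """Wason's actual rule: any three numbers in ascending order."""
    return seq[0] < seq[1] < seq[2]

def assumed_rule(seq):
    """The hypothesis most participants formed: each number is 2 more than the last."""
    return seq[1] - seq[0] == 2 and seq[2] - seq[1] == 2

# Confirmatory tests: sequences chosen to FIT the assumed rule.
confirming_tests = [(8, 10, 12), (20, 22, 24), (-4, -2, 0)]
# Every one also passes the hidden rule, so the wrong hypothesis survives.
print(all(hidden_rule(s) for s in confirming_tests))  # True

# A disconfirming test: a sequence the assumed rule REJECTS.
probe = (1, 2, 3)
print(assumed_rule(probe), hidden_rule(probe))  # False True -> hypothesis falsified
```

Only by proposing a sequence that their own hypothesis rejects, like 1-2-3, could participants have discovered that the real rule was broader than they assumed.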

A second interesting experiment

In his similar selection task, Wason gave participants particular information about a set of objects and asked them what further information they would need to determine whether a certain rule (of the type “if A, then B”) applied. In this test and most of its variations, people fell victim to the confirmation bias, seeking information that validates the rule and dismissing evidence that might disprove it.
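A minimal sketch of the selection task’s “if A, then B” logic, using the classic card variant (the specific cards and helper function are my own illustration): the rule is “if a card has a vowel on one side, it has an even number on the other,” and a card is worth turning over only if it could falsify the rule.

```python
def can_falsify(visible_side):
    """A card can falsify the rule only if its hidden side could
    reveal a vowel paired with an odd number."""
    if isinstance(visible_side, str):
        # A visible vowel falsifies the rule if the hidden number is odd.
        return visible_side in "AEIOU"
    # A visible odd number falsifies the rule if the hidden letter is a vowel.
    return visible_side % 2 == 1

cards = ["E", "K", 4, 7]
print([c for c in cards if can_falsify(c)])  # ['E', 7]
```

The logically correct choice is the vowel and the odd number. Most participants instead pick the vowel and the even number, even though the even card can only confirm, never refute, the rule.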

This bias emerges more strongly when the issues at stake are emotionally charged and the beliefs deeply rooted. If asked to debate the existence of God, a pious man will probably start listing all cases of divine revelation, whereas an atheist will focus on the evidence that denies any possibility of a higher power’s existence. Neither is likely even to consider a source that contradicts their beliefs. It is amazing how people keep searching for the information that confirms their beliefs and ignore all the rest.

Various forms of Confirmation Bias

The confirmation bias appears in various forms. One of them is known as attitude polarization. This is what happens when a group has huge disagreements provoked by the same information; the same evidence can move the two opposing views even further apart.

Then there is the illusory correlation, i.e., when people tend to find a connection between two events or situations that are not actually linked.

And finally, the irrational primacy effect, which refers to the situation where we give more weight to the early information we have about an event, and less weight to information that arrives later.

How can confirmation bias affect us?

In general, confirmation bias can lead to overconfidence, causing investors to ignore evidence. This overconfidence produces an almost blind faith in the beliefs they have formed, even in the face of contradictory information, sometimes with considerable monetary losses. By contrast, investors who are more thoughtful and resist this bias produce more precise, reliable results, usually with the attendant financial profit.

For years before the advent of scientific medicine, doctors believed that a recovery after medical treatment was due to the intake of the medicine, not examining, or simply ignoring, other parameters that might have played an equally important role in the recovery of the patient.

More specifically, there are three different ways the confirmation bias influences us, and all of them have to do with how we search for, interpret, and retrieve information from memory.

The first is the biased search for information.

We are biased in the way we seek information. Although it may seem that we look for information objectively, we tend to test hypotheses and ask questions in a way that confirms our beliefs. The way we phrase a question may substantially alter the result of research; for example, when people are asked “Are you happy with your social life?”, they tend to answer more positively than when asked “Are you unhappy with your social life?”

In a similar test, participants were asked to award custody of a child to parent A or parent B, given the information that parent A was moderately capable of being appointed guardian, while parent B, although he maintained a good relationship with the child, had professional obligations that would keep him occupied. The phrasing of the question influenced the answers: parent A was deemed more suitable when the question was “Which parent should have custody of the child?”, whereas when asked “Which parent should be denied custody of the child?”, most people chose parent B. The difference was that, in the first case, influenced by the phrasing of the question, people were looking for positive characteristics in parent A, whereas in the second case, they were looking for negative characteristics in parent B.

Another interesting example is how biased job interviews are. When someone interviews a candidate, they have already read the CV and formed a first impression of whether the person is suitable for the job. As a result, the questions asked during the interview tend to confirm that prior belief. One way to solve this problem would be to ask the same questions of all the candidates, or, even better, to have a pool of questions, some of which are posed to each candidate at random, preferably by different interviewers. Even then, the best approach is to check the candidate’s previous achievements; you can’t figure out a person’s abilities from a simple interview.

Another important aspect is selective exposure, which occurs when people search only for information that supports their beliefs. The interesting part is that people with higher confidence tend to seek out arguments that oppose their view in order to strengthen their opinion. Less confident people, on the contrary, tend to look only for information that supports their views.

The second is biased interpretation.

This occurs when two people or parties have the same information, so the biased search for information does not come into play, yet they reach different conclusions due to the biased way they interpret that information.

An experiment conducted at Stanford University involved participants of whom half supported the death penalty and the other half opposed it. All read short descriptions of two fictional studies: one comparing US states with and without capital punishment, the other comparing a state’s murder rates before and after the introduction of the death penalty. First, participants read both studies and were asked whether they had changed their minds. They then proceeded to read a more thorough description of each study’s procedure and were asked to evaluate whether the study was convincing and carefully conducted.

Example from Wikipedia

“The participants, whether supporters or opponents, reported shifting their attitudes slightly in the direction of the first study they read. Once they read the more detailed descriptions of the two studies, they almost all returned to their original belief regardless of the evidence provided, pointing to details that supported their viewpoint and disregarding anything contrary. Participants described studies supporting their pre-existing view as superior to those that contradicted it, in detailed and specific ways. Writing about a study that seemed to undermine the deterrence effect, a death penalty proponent wrote, ‘The research didn’t cover a long enough period of time,’ while an opponent’s comment on the same study said, ‘No substantial evidence to contradict the researchers has been presented.’”


The results indicated that people set higher standards for and are more demanding with claims that contradict their opinion, an effect termed “disconfirmation bias.”

The important thing to note here is that IQ, or how clever someone is, does not determine how biased they are. Nor is interpretation bias connected only to emotionally or ethically charged matters.

Participants in a vehicle safety experiment all read the same information about car safety, after having taken the SAT as an assessment of their intelligence. When asked whether they would allow a dangerous German car on American streets, or a dangerous American car on German streets, they responded that the former should be removed from the streets sooner than the latter.

In another experiment, when people were asked to evaluate the evidential significance of statements supporting or denying a person’s responsibility for a theft, they considered the acquitting statements more meaningful when they believed the person to be innocent. Conversely, they gave more credit to the convicting statements when they had formed the opinion that the person was guilty.

The third is biased memory.

Even if we succeed in gathering unbiased information and interpreting it in an unbiased way, our biased memory might still lead us to flawed conclusions. We may recall only the evidence, knowledge, and memories that support our belief. This form of confirmation bias is called “selective recall” or “confirmatory memory.”

A study on the subject required participants first to read a woman’s profile, noting some of her introvert and some of her extrovert characteristics. They were then divided into two groups, one tasked with evaluating the woman’s suitability for a position as a librarian, the other for a position as a real estate agent. It was observed that the group considering the woman for the librarian’s job recalled more examples of the woman’s introversion, whereas the real estate group recalled more cases of her extrovert behavior.

People also seem to selectively retrieve from memory examples of their own introvert or extrovert behavior, depending on which personality type they approved of most at the time. In a relevant experiment, participants recalled more introvert examples when told that introverted people were considered more successful. Conversely, if told that extrovert behavior leads to success, more personal incidents of extrovert behavior came to mind.

It has been theorized that memories that confirm our thoughts come to mind more easily (schema theory). Conversely, it has also been speculated that memories formed from surprising, unexpected evidence tend to linger in the mind precisely because of their striking element.

Related effects

Polarization of opinion

When people with opposing views interpret new information in a biased way, their views can move even further apart. We call this process “attitude polarization.”

Persistence of discredited beliefs

In this case, confirmation bias causes people not to change their minds even after the initial information has been discredited. Experiments have been conducted in which subjects were given false information designed to create an opinion, their attitude was measured, they were then informed that the information was fake, and their position was measured again to check whether it returned to its original level.

Another experiment required participants to read the evaluations of two firefighters, as well as their results on a risk-aversion test. The information was once again adjusted by the experimenters, focusing either on a positive or a negative quality: some participants read about a risk-taking firefighter performing well, whereas others were informed that the firefighter did not do as well as a more cautious colleague. The data was falsified and, even if the studies had been real, there would have been insufficient evidence to evaluate firefighters as a whole. Even after the revelation that the studies were fictional, participants acknowledged the falsehood but considered the admittedly disproved information irrelevant to their personal opinions.

Preference for early information

Preference for early information is the tendency to subconsciously put more weight on information perceived earlier in a series, even when the order is unimportant.

When someone is portrayed as “smart, hard-working, judgemental, envious,” people tend to focus on his positive attributes rather than his negative ones, only because the former come first in the description. This tendency is known as the irrational primacy effect, and it should not be confused with the memory primacy effect, which describes how we recall earlier items in a series more easily. Confirmation bias, here in the form of interpretation bias, explains that we are affected by the initial information we receive, forming an opinion and interpreting all subsequent information accordingly.

How to avoid confirmation bias.

Search for disconfirming evidence. Try to find information that contradicts your opinion, and put yourself in the position of the person supporting the opposing argument. In simple words, try to refute your own beliefs.




About the author

By Plato