Confirmation Bias - Definitions, Causes, Risks, Advantages & Debiasing
Updated: Feb 19, 2020
DEFINITION & HISTORY
Confirmation bias is our tendency to search for, interpret, and recall information in ways that confirm our pre-existing beliefs or hypotheses. We do this in three main ways.
Firstly, while evaluating a belief, we may mistakenly search for the wrong kind of information, which helps us neither confirm nor disconfirm the belief. E.g. assume for a moment that you know nothing about cakes, and you're told to verify the statement that all cakes are sweet. You would go out and try cakes, and it is likely that you would only find sweet ones. After trying a number of different cakes, you would conclude that all cakes must be sweet. But a much better strategy is to search specifically for non-sweet cakes: it doesn't matter how many sweet cakes you find, since a single savory cake can disconfirm the rule (Nickerson, 1998).
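The difference between the two search strategies can be sketched in a few lines of Python. The cake "data" below is invented purely for illustration; nothing in the article specifies these numbers.

```python
# Invented data: one savory cake hidden among 100.
cakes = ["sweet"] * 99 + ["savory"]

# Confirmatory search: tasting sweet cake after sweet cake can never
# prove the universal rule "all cakes are sweet".
confirmations = sum(cake == "sweet" for cake in cakes[:99])
print(confirmations)  # 99 confirmations, yet the rule remains unproven

# Falsifying search: look specifically for a non-sweet cake.
counterexample = next((cake for cake in cakes if cake != "sweet"), None)
print(counterexample)  # 'savory' -> the rule is disconfirmed
```

However large the count of confirmations grows, it settles nothing; the single counterexample does.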
Secondly, when we receive new information about a belief, our interpretation of this information is often influenced by what we already believe or want to believe. E.g. people tend to evaluate the performance of a government or a leader based on whether they voted for them or not (Nickerson, 1998).
Lastly, our memory may fail us and supply only information that supports our present beliefs. Similarly, our attention may trick us by focusing only on information that supports our beliefs. These phenomena are referred to as selective memory and selective perception, respectively (Nickerson, 1998).
An early recognition of this bias can be seen in the Italian poet Dante's famous work, the Divine Comedy, wherein he noted: "opinion—hasty—often can incline to the wrong side, and then affection for one's own opinion binds, confines the mind". In modern psychology, confirmation bias was first described in the 1960s by P. C. Wason. In one of his most famous experiments, known as the Wason selection task, participants were presented with four cards, each with a number on one side and a letter on the other, though only one side is visible at a time (Wason, 1968). Consider four cards whose visible sides show 'L', '8', 'I', '3', and the conditional rule 'If a card has a vowel on one side, it has an even number on the other side.' The task is to identify those cards, and only those cards, that must be flipped over to evaluate whether the rule is valid. Take a moment to think about how many, and which, cards you would turn to verify the rule. The correct answer is to flip 'I', which must reveal an even number if the rule is true, and '3', which must not reveal a vowel. The task is simple enough when you think about it, but surprisingly, the success rate is only about 20%. The most common errors are the unnecessary inclusion of the even card and the failure to choose the odd card (Dawson et al., 2002). The even card reveals nothing useful about the rule, as the rule doesn't need to apply in reverse to be true, whereas the odd card can disconfirm the rule if its other side reveals a vowel.
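The falsification logic behind the correct answer can be expressed as a short Python sketch. The card values come from the example above; the helper function and its name are ours, not from Wason's paper.

```python
def must_flip(visible: str) -> bool:
    """Return True if flipping this card could falsify the rule
    'if a card has a vowel on one side, it has an even number on the other'."""
    if visible.isalpha():
        # Only a vowel card matters: its hidden number might be odd.
        return visible.lower() in "aeiou"
    # Only an odd-number card matters: its hidden letter might be a vowel.
    return int(visible) % 2 == 1

cards = ["L", "8", "I", "3"]
print([card for card in cards if must_flip(card)])  # ['I', '3']
```

Note that the function never asks whether a card could confirm the rule, only whether it could disconfirm it; that is exactly the shift in strategy most participants fail to make.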
CAUSES
Cognitive causes are the psychological mechanisms that may explain the bias. It is likely that no single explanation accounts for every instance of the bias; each is valid in some cases and invalid in others.
First, there is cognitive dissonance: the discomfort we experience when there is a contradiction between what we believe and what we do or experience. If our experience supports our existing beliefs, we experience positive feelings; if it contradicts them, we experience negative feelings. Hence the preference for confirmatory information (Nickerson, 1998).
Secondly, some researchers suggest that we as humans are limited by time, motivation and mental resources. So we consider only one hypothesis at a time, and even for that hypothesis, only one possibility: that it is true or that it is false, but not both at once. Once we are satisfied with our consideration of one possibility, we often do not proceed to consider the others. This pattern constricts open-ended, exploratory thought and enables a narrow, confirmatory way of thinking (Nickerson, 1998).
Next, there is the view that confirmation bias is learned through observation and socialization, beginning in early childhood. In most cultures, significant importance is placed on defending and rationalizing one's beliefs and actions, but little on the reasons that could be given against them (Nickerson, 1998).
And finally, there is the view that belief is the default state of mind, while disbelief is the exception rather than the rule. Researchers have observed that, in the absence of compelling evidence one way or the other, people are more inclined to assume that a statement is true than that it is false. Even asking people to imagine or explain why a random hypothesis might be true makes them more likely to believe that it is true (Nickerson, 1998).
The striatum and the prefrontal cortex have been observed to be involved in confirmation bias. The striatum is the part of the brain that helps us learn the rewards or risks of behaviors, and the validity of beliefs, by keeping a tally of positive and negative cases in our experience. The prefrontal cortex (PFC) is the part of the brain that remembers and maintains our beliefs. Ideally, we would expect communication in only one direction: the striatum, based on its evaluation of evidence, should inform the PFC so that false beliefs are rejected and valid ones are adopted and maintained. But communication in the other direction has also been observed, wherein beliefs held in the PFC modify the learning process in the striatum so that it favors pre-existing beliefs while evaluating evidence (Doll et al., 2011).
ADVANTAGES
For the bias to have been passed down to us genetically or culturally from our ancestors, it must be beneficial under certain conditions.
In many cases, learning by accumulating evidence from our own experience is difficult, e.g. in situations where the risks and rewards lie too far in the future, such as the rewards of long-term saving or the risks of smoking. In such cases, adopting fully formed beliefs from external sources is more advantageous than learning from our own experience. This strategy would have been especially successful in the cooperative groups, such as tribes, in which our early ancestors lived, where there would be very little reason to doubt the intentions of other group members.
Another view is that evolution has not geared us towards testing hypotheses in accordance with the highest standards of science and logic, but rather towards survival, and survival requires us to identify potential rewards and avoid costly errors. So when we perceive the possibility of an attractive reward or a costly error, we focus on it intently and try to ascertain whether our expectations are indeed valid (Nickerson, 1998).
And finally, some researchers have also observed that socially skilled persons use a confirmation strategy in social situations, wherein they form early hypotheses about the personality traits of new individuals they meet, and then ask matching questions to confirm the hypotheses (Dardenne & Leyens, 1995).
RISKS
When information is presented sequentially over time, people tend to form beliefs based on the early information and then confirm those beliefs with the information presented later, thus giving more importance to first impressions. This is called the primacy effect, echoing the old saying that the first impression is the last impression (Nickerson, 1998).
Secondly, once a belief or opinion has been formed, it can be very difficult to change, even in the face of strong evidence that it is wrong. This is why people's political worldviews, religious beliefs and prejudices against communities often don't change over an entire lifetime, in spite of encounters with contradictory information (Nickerson, 1998).
Next, social media platforms show individuals information they are likely to agree with, while filtering out opposing views, thus amplifying confirmation bias. People also tend to limit their information sources to those that support their pre-existing beliefs. This leads to attitude polarization, where people cluster at the extreme ends of opinions, with little middle ground.
Lastly, people tend to hold exaggerated beliefs about their own abilities, a tendency known as overconfidence bias. Confirmation bias amplifies overconfidence, as people subjectively interpret new information to support their exaggerated self-beliefs.
DEBIASING
While it is difficult to suppress or eliminate confirmation bias, researchers have found that individuals can learn the value of looking for potentially disconfirming evidence. To strengthen this learning process, we can promote, through education and other cultural means, a general skepticism in society, such that people do not accept the validity of any statement without some further thought and investigation (Dawson et al., 2002).
Next, the one condition under which people push themselves to consider multiple points of view and anticipate all possibilities is when they know in advance that they will need to justify their conclusions to well-informed others. Therefore, promoting a culture of debate and discussion in public forums can nudge people towards exploratory thinking instead of confirmatory thought (Lerner & Tetlock, 1999).
Another approach points to the fact that when new information regarding a pre-existing belief is presented sequentially, people give more weight to the initial information. But when the new information is presented simultaneously, people concentrate on comparing, evaluating, and integrating the new pieces, which shifts focus away from the existing belief and reduces confirmation bias (Hogarth & Einhorn, 1992).
Lastly, confirmation bias permits a shallow analysis of new information, in which contradictory evidence is casually dismissed rather than seriously analyzed and integrated. If people are made to pay more attention to new information, confirmation bias is reduced. E.g. one way to make people pay more attention to information is to make it harder to read, so that more effort is required to process it (Hernandez & Preston, 2013).
REFERENCES
Dardenne, B., & Leyens, J.-P. (1995). Confirmation bias as a social skill. Personality and Social Psychology Bulletin, 21(11), 1229–1239.
Dawson, E., Gilovich, T., & Regan, D. T. (2002). Motivated reasoning and performance on the Wason selection task. Personality and Social Psychology Bulletin, 28(10), 1379–1387.
Doll, B. B., Hutchison, K. E., & Frank, M. J. (2011). Dopaminergic genes predict individual differences in susceptibility to confirmation bias. Journal of Neuroscience, 31(16), 6188–6198.
Hernandez, I., & Preston, J. L. (2013). Disfluency disrupts the confirmation bias. Journal of Experimental Social Psychology, 49(1), 178–182.
Hogarth, R. M., & Einhorn, H. J. (1992). Order effects in belief updating: The belief-adjustment model. Cognitive Psychology, 24(1), 1–55.
Lerner, J. S., & Tetlock, P. E. (1999). Accounting for the effects of accountability. Psychological Bulletin, 125(2), 255–275.
Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175–220.
Stanovich, K. E., West, R. F., & Toplak, M. E. (2013). Myside bias, rational thinking, and intelligence. Current Directions in Psychological Science, 22(4), 259–264.
Wason, P. C. (1968). Reasoning about a rule. Quarterly Journal of Experimental Psychology, 20(3), 273–281.