Of all the topics we’ve discussed this semester, I’ve been most intrigued by implicit bias: how our attention, memories, and decisions are so strongly affected by processes beyond our control. While discussing problem solving, we learned about confirmation bias, which concerns how we handle evidence related to our beliefs. When we evaluate our beliefs, we’re likely to encounter both confirming and disconfirming evidence. A logical approach would be to weigh both types of evidence equally; some researchers even suggest we should give disconfirming evidence more thought because it is likely to provide more information. Confirmation bias, however, makes us more sensitive to evidence that supports our existing beliefs and leads us to neglect evidence that opposes them.
When discussing confirmation bias in class, Dr. Rettinger mentioned that the best way to overcome confirmation bias and belief perseverance (the tendency to ignore even undeniable disconfirming evidence) is to appeal to people’s emotions and engage “System 1” (automatic, emotional) resources. This intrigued me, so I set out to find more information about how to combat confirmation bias. I was also interested in how to make myself more aware of, and resistant to, my own biases.
The first article I read, “How to Get People to Overcome Their Bias” from BBC.com, discussed an experiment conducted at Princeton that explored two theories of confirmation bias and the effectiveness of different strategies for overcoming the bias arising from each. The researchers considered two accounts of why confirmation bias happens. The motivational theory suggests that people are biased because of their motivations: their job, their friends, their desires, or how they see themselves. To defeat motivation-based biases, we would need to change people’s motivations. The second theory, the cognitive theory, suggests that people are biased because they lack effective strategies for evaluating new evidence. To defeat cognition-based biases, we would need to give people better methods for considering evidence.
To test these theories, the researchers presented study participants with information supporting the death penalty (e.g., evidence that the death penalty lowers murder rates). Participants either strongly supported or strongly opposed the death penalty. In the first part of the experiment, the researchers simply presented the evidence and measured how participants’ beliefs changed. Not surprisingly, those who already supported the death penalty became more confident in their beliefs after seeing the evidence. However, those who opposed the death penalty also became stronger in their opposition, even after seeing evidence supporting the death penalty: belief perseverance in action. The researchers referred to this as biased assimilation.
In a follow-up study, the same researchers tested two methods for assimilating new evidence. One group was instructed to be “objective and unbiased” and to imagine themselves “as a judge or juror asked to weigh all of the evidence in a fair and impartial manner.” This approach tested how changing an individual’s motivations would affect the assimilation of new evidence. The second group was told to consider the methodology of the study producing the evidence that the death penalty lowered murder rates, and then to imagine that the results had instead supported the opposite finding (that the death penalty raised murder rates). This strategy, called “consider the opposite,” tested whether changing participants’ cognitive processes for evaluating evidence could affect their beliefs.
They found that the strategy of changing people’s motivations did not work; the results were the same as in the initial study. However, asking people to “consider the opposite” did result in participants overcoming biased assimilation: they evaluated new evidence fairly even when it didn’t support their existing beliefs, and they did not become more extreme in their existing views. The article concluded that the study demonstrated that wanting to be objective isn’t enough to overcome confirmation bias; instead, we must learn better cognitive strategies for scrutinizing new information.
While I feel the information gained from this study could help me overcome my own confirmation bias, I’m skeptical that it would work with others. In my experience, confirmation bias appears strongest during debates on highly controversial topics that people feel personally invested in (evolution vs. creationism, climate change, gun control, etc.). I believe both the motivational and cognitive strategies mentioned above would fail there because people simply do not want to see things any other way. Outside of a lab, the lack of motivation to consider evidence fairly would, I suspect, negate any ability of the “consider the opposite” approach to overcome biased assimilation. While the research findings shouldn’t be discarded as irrelevant, I feel they highlight how appealing to emotions and motivations is necessary in more real-life situations.
With that in mind, I read a second article from Pacific Standard online titled “Changing Anti-Vaxxers’ Minds.” This article discussed studies investigating how different forms of disconfirming evidence affect people opposed to vaccination. In one study, participants were asked either to read evidence debunking links between vaccines and autism, to read about bird feeders (a neutral control), or to read about a child with measles and look at pictures of children with vaccine-preventable diseases. This first study found that reading about vaccines and autism had the same effect as reading about bird feeders (none), while reading the story about a child with measles and seeing photos of other sick children made anti-vaxxers slightly less skeptical and more in favor of vaccinating. However, a second, similar study found that both test conditions (refuting vaccine–autism connections and stories/images of sick children) resulted in stronger anti-vaccination beliefs.
I believe the first study indicates that appealing to emotions shows some promise for overcoming confirmation bias, but the second study emphasizes just how strongly we cling to our beliefs. From my own (non-scientific) experience, it seems the personal attachment people feel to their beliefs produces defensiveness whenever disconfirming evidence is encountered. I’ve felt this in myself, and I have to work hard to remind myself that my identity and personal values aren’t attached to the evidence I choose to believe. Confirmation bias also seems strongest when people feel they’ve invested more effort into believing something and defending that belief. I think some people feel that considering disconfirming evidence, or changing their opinions, is like abandoning a figurative ship and not being strong enough to maintain conviction. I wonder if the key to overcoming confirmation bias is not to keep seeking ways to make people better at evaluating evidence, but instead to make people more aware that confirmation bias exists and that it keeps them from having a truly educated opinion. If people felt they were being misled by their own unconscious tendencies, would they then feel more compelled to seek out new evidence on their own? I know I do.
If you’re interested in reading the articles I mentioned, you can click on the titles of the articles above or follow the links below:
How to Get People to Overcome Their Bias: http://www.bbc.com/future/story/20170131-why-wont-some-people-listen-to-reason
Changing Anti-Vaxxers’ Minds: https://psmag.com/social-justice/changing-anti-vaccine-minds