Background
In recent history, mass vaccination has proved essential to managing pandemics. However, a vaccine's effectiveness at the population level depends on how many people are willing to take it. One approach to encouraging uptake is to publish information about safety and effectiveness, but confirmation bias research in other domains suggests that people may evaluate this information through the lens of their existing beliefs.
Methods
This study used a 2 × 2 design to investigate whether people's (n = 3899) existing beliefs influenced their ability to correctly evaluate data from a fictional trial presented in a frequency table. Treatment groups saw different combinations of trial outcome (intervention effective versus ineffective) and trial topic (related versus unrelated to vaccines).
Results
Results provided robust evidence for confirmation bias in the domain of vaccines: people made systematic errors (P < 0.01) when evaluating evidence that was inconsistent with their prior beliefs. This pattern emerged among people with both pro-vaccination and anti-vaccination attitudes. Errors were attributed to confirmation bias because no such differences were detected when participants evaluated data unrelated to vaccines.
Conclusions
People are prone to misinterpreting evidence about vaccines in ways that reflect their underlying beliefs. Confirmation bias is an important consideration for vaccine communication.
Confirmation bias is defined as searching for and assimilating information in a way that favours existing beliefs. We show that confirmation bias is a natural consequence of boundedly rational belief updating by presenting the BIASR model (Bayesian updating with an Independence Approximation and Source Reliability). Upon receiving information, an individual simultaneously updates beliefs about the hypothesis in question and about the reliability of the information source; these beliefs form a Bayesian network. Fully rational updating in such a network introduces numerous dependencies between beliefs, and tracking them places an unrealistic demand on memory. The BIASR model incorporates a cognitive limitation proposed in prior research: human cognition sidesteps this memory demand by assuming independence between beliefs. We show that a Bayesian belief updating model incorporating this independence approximation generates many forms of confirmation bias, including biased evaluation, biased assimilation, attitude polarisation, belief perseverance and confirmation bias in the selection of sources.
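The joint update over a binary hypothesis and a binary source-reliability variable, followed by the independence approximation (retaining only the marginals), can be sketched as follows. This is an illustrative toy with assumed likelihoods (a reliable source reports the truth with probability 0.9; an unreliable one is at chance), not the paper's actual parameterisation:

```python
def update(p_h, p_r, supports_h):
    """Jointly update belief in hypothesis H and source reliability R
    after a report that supports (or opposes) H, then keep only the
    marginals: the independence approximation."""
    def likelihood(h, r):
        # Assumed toy likelihoods: a reliable source (r=True) reports
        # the truth with probability 0.9; an unreliable one is at chance.
        if not r:
            return 0.5
        return 0.9 if supports_h == h else 0.1

    joint = {}
    for h in (True, False):
        for r in (True, False):
            prior = (p_h if h else 1 - p_h) * (p_r if r else 1 - p_r)
            joint[(h, r)] = likelihood(h, r) * prior
    z = sum(joint.values())
    post_h = (joint[(True, True)] + joint[(True, False)]) / z
    post_r = (joint[(True, True)] + joint[(False, True)]) / z
    return post_h, post_r

# A strong believer (P(H) = 0.9) who moderately trusts the source (P(R) = 0.7):
ph_c, pr_c = update(0.9, 0.7, supports_h=True)   # confirming report
ph_d, pr_d = update(0.9, 0.7, supports_h=False)  # disconfirming report
# Confirming evidence raises both belief and trust in the source;
# disconfirming evidence only dents the strong belief while sharply
# discounting the source: the biased-evaluation pattern.
```

Under these assumed numbers, the confirming report pushes belief and source trust up, while the disconfirming report leaves belief above 0.7 but drops perceived reliability below chance-level trust, illustrating how identical evidence is weighed asymmetrically.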