When facts are not enough


UMaine Professor Robert Wheeler asked me to present on science communication and risk perception to his Infectious Disease class. Because of Dr. Wheeler’s expertise and other topics covered in the class, I used the case of vaccines and autism to illustrate my points. The following is a summary of the lecture.

Science has evolved to generate consensus about what is true and what is not. A wealth of scientific truths has been generated and transferred to the public over the last half-century. Yet neither public support for research nor scientific literacy has increased significantly over the same period. Despite the best efforts of science writers like myself and the new generation of scientist-communicators, more and better information does not equal knowledge.

Why?

A lack of understanding about how science works is partly to blame for people believing things that science says are not true, but that’s not the whole story.

Even when people have the right information, they don’t always make the decision that scientists would favor. In the last decade or so, we have learned from psychologists and communication scientists that people view information through the “frames” of their values and beliefs. They trust messengers and seek out sources of information with whom they already feel aligned: friends, relatives, neighbors, pastors, political parties, and so on.

While this has probably always been true, a concurrent trend is magnifying the issue: the unprecedented access that we all have to information, and misinformation.

As I mentioned in the beginning of this post, I am using the example of autism to illustrate how people maintain false beliefs in the face of facts.

Autism is a sensitive and serious topic. The dramatic increase in children diagnosed with autism spectrum disorders is very real and very scary. The latest prevalence estimate, tracked by the CDC, is 1 in 110 children. While increased awareness and diagnosis partly explain the rise in cases (about 25 percent of it), they can’t explain all of the increase.

[Graph: autism diagnoses have increased roughly 600 percent]

So, in the 1990s many people were trying to figure out what was causing this increase.

In 1998, a study was published proposing that autism was linked to the MMR vaccine. Even though most of the authors later retracted the study’s conclusions (the experiment used a small sample and had no control group), a link had been made in the minds of many concerned parents (see Gerber and Offit for the details of this story). Around the same time, a federal review of mercury exposure drew attention to thimerosal, a mercury-containing preservative used in vaccines. Mercury is a neurotoxin, so it was easy for parents to seize on the idea that not just the MMR vaccine but all vaccines posed a risk, because they contained mercury. In 1999, despite little evidence that thimerosal-containing vaccines were associated with autism or any other harm, public health authorities recommended phasing them out in favor of thimerosal-free alternatives.

By the end of 2001, the only childhood vaccines still containing thimerosal were flu vaccines, which few children received. This precautionary step, coupled with the already publicized but unsubstantiated link between vaccination and autism, understandably provoked concern among parents (Gerber and Offit).

But what happened to autism diagnoses?

They continued to rise.

Dozens and dozens of studies conducted since 2001 have looked at vaccines and autism and the science does not support any connection.

“The data support a conclusion of no association between thimerosal-containing vaccines and autism in children,” Dr. Sarah Parker, Pediatrics (2004).

“There is insufficient evidence to suggest that the MMR vaccine causes autism, either due to the vaccine itself, the presence of thimerosal, or the simultaneous administration of multiple vaccines,” Heather Coates, Medical Reference Services Quarterly (2009).

“Twenty epidemiologic studies have shown that neither thimerosal nor MMR vaccine causes autism,” Gerber and Offit, Clinical Infectious Diseases (2009).

Many of the scientists who authored these studies have made the point that concern about vaccines has diverted attention—and research funding—away from efforts to determine the real cause or causes of autism (e.g., genetic influence, other environmental factors).

So why is there still such fear of vaccines? Why hasn’t attention shifted to more probable causes?

Why do people still believe there is a link? How do we support our false beliefs?

1. Vaccines are given at around the same age that autism is typically diagnosed, so correlation easily becomes causation. Our brains seem to be wired to notice patterns, to believe that two things that happen at the same time are related. After a mercury link was ruled out, alternative theories emerged, such as the idea that the simultaneous administration of multiple vaccines overwhelms or weakens the immune system and creates an interaction with the nervous system that triggers autism in susceptible children. (In fact, although the number of vaccines administered to children has increased, the total immunologic load has decreased.)

2. We have lost social memory of childhood diseases.

[Graph: declines in childhood diseases following vaccination]

Vaccination is considered one of the great success stories of public health. Vaccines have eradicated smallpox, eliminated polio in the United States, and prevented outbreaks of mumps, measles, hepatitis, and whooping cough. Because these diseases are no longer common, we have forgotten them. The new generation of parents has no memory of these diseases from their own childhood, and so may question the need for vaccines. Side effects and reactions to vaccines appear more common than the diseases the vaccines prevent, increasing the perception that vaccines themselves represent a risk. Fear of vaccines has replaced fear of childhood diseases.

A theoretical, even disproved, risk such as getting autism from a vaccine is elevated above the real risk of being hospitalized or killed by influenza, or of fueling an outbreak of a disease like whooping cough or mumps. Harm resulting from immunization feels less acceptable than potential harm from not immunizing. This is a classic example of “omission bias”: causing harm through action (commission) seems worse than harm that results from inaction (omission). It is better, we feel, to do nothing (Amanna and Slifka).

Psychology offers additional explanations of how people obtain and process information. Much of this research focuses on political beliefs, but it applies to any factual information. Unsubstantiated beliefs are maintained through “motivated reasoning”: rather than searching rationally for the truth, people seek out information that confirms what they already believe.

Even when confronted with facts, people don’t necessarily change their belief.

People interpret facts differently, and the differences often fall along lines of political affiliation and ideology (Democrat vs. Republican). If a new fact agrees with your belief, you might interpret it in an accepting way that strengthens your belief. If it contradicts your belief, you might interpret it defensively and resist changing your mind. Interpretation gives individuals leeway to acknowledge undeniable realities while continuing to justify their beliefs and opinions. In the case of the Iraq War, Republicans and Democrats differed in their interpretations of the fact that the United States did not find weapons of mass destruction. Democrats concluded that the weapons did not exist, supporting their opposition to the war. Republicans interpreted the fact to mean that the weapons had been destroyed, moved, or not yet found, thus maintaining the rationale for the invasion (Gaines et al. 2007).

When confronted with the fact that mercury is no longer in vaccines, a worried parent might conclude that vaccines themselves, not mercury, are the problem.

Another study looked at whether people change their beliefs when presented with corrected information. Such corrections can actually strengthen misperceptions; the truth can backfire (Nyhan and Reifler).

However, there does seem to be a “tipping point” at which enough disconfirming facts, amplified by broader public sentiment, trigger anxiety and lead people to change their beliefs (Redlawsk et al.).

More basic emotions also play a role: fear, anxiety, desire.

A study in the journal Sociological Inquiry looked at the strength and resilience of the belief among many Americans that Saddam Hussein was linked to the terrorist attacks of 9/11 (Prasad et al.). The authors concluded that the belief resulted from an urgent need by many Americans to justify a war already in progress. The study argues that the primary cause of misperception in the 9/11-Saddam Hussein case was not the presence or absence of accurate data, but respondents’ desire to believe particular kinds of information.

We get attached to our beliefs. They become part of our identity. And so for the people interviewed in this study, the overwhelming evidence that there was no link between Saddam and the 9/11 attacks had no influence on their belief that such a link existed. It had to be true for people to make sense of the war.

So for parents crushed by an autism diagnosis, “there must be a reason.”

The intense desire for an autism cure often leads parent groups and nonprofit organizations to endorse causes and treatments without sufficient evidence of effectiveness. And, for better or worse, it is easier than ever to find (mis)information that supports your belief.

References

Black, Steven, and Rino Rappuoli. 2010. A Crisis of Public Confidence in Vaccines. Science Translational Medicine 2:1-6.

Gaines, Brian J., et al. 2007. Same Facts, Different Interpretations: Partisan Motivation and Opinion on Iraq. The Journal of Politics 69:957-974.

Gerber, Jeffrey S., and Paul A. Offit. 2009. Vaccines and Autism: A Tale of Shifting Hypotheses. Clinical Infectious Diseases 48:456-461.

Nyhan, Brendan, and Jason Reifler. 2010. When Corrections Fail: The Persistence of Political Misperceptions. Political Behavior 32:303-330.

Prasad, Monica, et al. 2009. “There Must Be a Reason”: Osama, Saddam, and Inferred Justification. Sociological Inquiry 79:142-162.

Redlawsk, David P., Andrew J.W. Civettini, and Karen M. Emmerson. 2010. The Affective Tipping Point: Do Motivated Reasoners Ever ‘Get It’? Political Psychology 31:563-593.