This weekend, on the outside chance that the rapture does not occur (I know I know, just humor me)… Please remember to give some love to Leon Festinger. Everything that happens next, he called it.
As the Arizona legislature debates a bill to demand proof of citizenship from presidential candidates, Slate has a new piece about why the “birther” movement won’t go away (which I came across thanks to Eric Knowles). Of particular interest:
The irony of all the birth-certificate proposals—similar bills have been introduced in six states—is that they contain the seeds of the birther movement’s destruction. The moment Obama calls their bluff and hands his birth certificate to the Arizona secretary of state, it’s over.
In theory. That’s the beauty of the birther myth, or any conspiracy theory: No amount of evidence can ever completely dispel the questions. When Obama produced his Hawaii birth certificate and the state of Hawaii verified it, it was a fake. When reporters uncovered announcements of Obama’s birth in 1961 copies of the Honolulu Advertiser and the Honolulu Star-Bulletin, they had been planted. If the Arizona secretary of state verified Obama’s birth certificate, that would be due to the government mind-control chip implanted in his molar.
To put all this another way: Birtherism is here to stay. And not because more people are going crazy, but because crazy has been redefined…
The article puts a bit of a partisan spin on the underlying psychological explanation (oh those crazy conservatives!). But this is just another manifestation of a basic psychological process documented in the 1950s by Leon Festinger (which I’ve previously discussed in relation to the persistence of the discredited vaccine-autism link).
When people have a deeply held belief that they have publicly committed to, disconfirmatory evidence doesn’t necessarily weaken the belief. Instead, Festinger predicted — based on cognitive dissonance theory — that under the right circumstances, disconfirmatory evidence can make beliefs grow stronger and metastasize. Festinger first documented this phenomenon among doomsday cultists, and it plays out again and again among modern conspiracy theories from the left, right, and in between.
Where this all intersects with politics, as the Slate piece points out, is in how politicians can fan and exploit conspiracies. Festinger laid out five conditions for disconfirmatory evidence to intensify belief. The fifth is, “After the disconfirming evidence comes to light, the believer has social support from other believers.” Politicians speaking in code can sound to believers like sympathetic supporters, while still leaving themselves plausible deniability in more rational circles. I don’t know how many politicians have read Festinger, but they almost certainly know what they’re doing.
Update: Brendan Nyhan sent me a link to a forthcoming paper of his (with Jason Reifler) titled When Corrections Fail: The Persistence of Political Misperceptions. In a series of experiments, they show that correcting misperceptions can backfire when the correction runs contrary to somebody’s political beliefs. For example, conservatives became more certain that Iraq had WMDs prior to the war when they read news articles that corrected that misperception. Likewise for liberals and the misperception that Bush banned all stem cell research.
Their paper doesn’t test all aspects of Festinger’s model, though it does draw on work in the cog-dissonance tradition. Festinger thought that the social context of beliefs was important: you have to publicly commit to your beliefs, and after receiving disconfirmatory evidence you have to have social support from fellow believers. Nyhan and Reifler’s experiment dealt with subjects’ pre-existing beliefs, so it’s entirely possible that those elements were part of the prior experience that subjects brought into the lab. It would be interesting to run an experiment to see if you could modulate the backfire effect by amplifying or dampening those factors experimentally.
What do vaccines and the end of the world have in common? To some activists, it might seem like the former is going to bring about the latter. To the rest of us, there may be a more subtle connection.
A new article in the NEJM examines the characteristics of families that refuse vaccination. Chris Mooney blogs about it at The Intersection, noting that the families that refuse vaccination tend to seek medical information from a tightly interconnected community of alternative healers and anti-vaccination advocates, rather than relying on the scientific or medical establishment.
Mooney also has a piece in Discover about why the vaccination-autism controversy persists. I was particularly struck by this passage:
Meanwhile, in the face of powerful evidence against two of its strongest initial hypotheses—concerning MMR and thimerosal—the vaccine skeptic movement is morphing before our eyes. Advocates have begun moving the goalposts, now claiming, for instance, that the childhood vaccination schedule hits kids with too many vaccines at once, overwhelming their immune systems. Jenny McCarthy wants to “green our vaccines,” pointing to many other alleged toxins that they contain. “I think it’s definitely a response to the science, which has consistently shown no correlation,” says David Gorski, a cancer surgeon funded by the National Institutes of Health who in his spare time blogs at Science-Based Medicine, a top medical blog known for its provaccine stance. A hardening of antivaccine attitudes, mixed with the despair experienced by families living under the strain of autism, has heightened the debate—sometimes leading to blowback against scientific researchers.
What this immediately reminded me of was Leon Festinger’s book When Prophecy Fails.
Festinger was a social psychologist who developed the theory of cognitive dissonance. WPF is an account of how Festinger and his colleagues infiltrated a cult that had predicted that the world would end on a specific day (December 20, 1954). He wanted to see what would happen when the predicted day came and went without incident.
Ordinary logic would suggest that if your theory makes an incorrect prediction, you should go back and question the theory. But what happened instead was that the disconfirming evidence actually strengthened the cultists’ beliefs. They decided that God had temporarily spared the world because of their dedication, and they became even more committed to trying to spread their views before the revised apocalypse date came around.
Drawing on cognitive dissonance theory, Festinger explained why the group’s beliefs were strengthened. Broadening beyond just cults, he outlined five conditions that will lead people to intensify their beliefs in the face of disconfirming evidence:
- The belief is deeply held
- The believer has taken public actions that reflect his/her commitment and that cannot be undone
- The belief leads to specific, falsifiable predictions about something that will happen
- The specific, falsifiable predictions are disconfirmed by objective evidence
- After the disconfirming evidence comes to light, the believer has social support from other believers
Under these conditions, Festinger argued, it is well-nigh impossible to reverse the belief. You hold it too dearly, you’ve already committed yourself to it, and other people are telling you to hang in there. So instead, you try to figure out how you can twist or morph your belief in order to hold on to it.
Festinger’s account of these conditions and consequences does a strikingly good job of describing the arc of the vaccination-autism story. In the late 1990s, parents and activists became convinced that vaccines were causing autism. They developed two specific predictions. First, they proposed that the MMR vaccine triggered a release of toxins that had accumulated in the intestines; but the data failed to support this view, and most of the scientists who first proposed it retracted the conclusion. Activists also argued that thimerosal (a preservative in vaccines that contains mercury) was responsible for the link; but when thimerosal was removed from most vaccines, autism rates didn’t go down. So now, as Mooney describes it, activists are once again “moving the goalposts.” They are now blaming other toxins or saying that vaccination schedules are at fault. The belief has morphed, but key elements of its original form are preserved (it’s something about vaccines), allowing them to feel like they haven’t been wrong.
Unfortunately, a broader casualty of this process is parents’ confidence in science and medicine. Festinger showed that disconfirming evidence didn’t just lead believers to believe more; it led them to proselytize more too. So now we face an increasingly outspoken group of sympathetic and passionate advocates telling parents not to believe scientists and doctors. And that undermined confidence has very real and dangerous consequences.
What should be done? Frankly, I’m not entirely sure. One problem, it seems to me, is that this debate is framed as “science” or “scientists” versus the anti-vaccine activists. Science is supposed to be the rules of the game and scientists the referees, not an opponent in it. But I’m not sure there’s any way around that. If one side doesn’t like where the rules lead, who else are they going to blame?