Mistakes Were Made - Carol Tavris
In 2003, after it had become abundantly clear that there were no weapons of mass destruction in Iraq, Americans who had supported the war and President Bush’s reason for launching it were thrown into dissonance: We believed the president, and we (and he) were wrong. How to resolve this? For Democrats who had thought Saddam Hussein had WMDs, the resolution was relatively easy: The Republicans were wrong again; the president lied, or at least was too eager to listen to faulty information; how foolish of me to believe him. For Republicans, however, the dissonance was sharper. More than half of them resolved it by refusing to accept the evidence, telling a Knowledge Networks poll that they believed the weapons had been found. The survey’s director said, “For some Americans, their desire to support the war may be leading them to screen out information that weapons of mass destruction have not been found. Given the intensive news coverage and high levels of public attention to the topic, this level of misinformation suggests that some Americans may be avoiding having an experience of cognitive dissonance.” You bet.8
Neuroscientists have recently shown that these biases in thinking are built into the very way the brain processes information—all brains, regardless of their owners’ political affiliation. For example, in a study of people who were being monitored by functional magnetic resonance imaging (fMRI) while they were trying to process dissonant or consonant information about George Bush or John Kerry, Drew Westen and his colleagues found that the reasoning areas of the brain virtually shut down when participants were confronted with dissonant information, and the emotion circuits of the brain lit up happily when consonance was restored.9 These mechanisms provide a neurological basis for the observation that once our minds are made up, it is hard to change them.
Indeed, even reading information that goes against your point of view can make you all the more convinced you are right. In one experiment, researchers selected people who either favored or opposed capital punishment and asked them to read two scholarly, well-documented articles on the emotionally charged issue of whether the death penalty deters violent crimes. One article concluded that it did; the other that it didn’t. If the readers were processing information rationally, they would at least realize that the issue is more complex than they had previously believed and would therefore move a bit closer to each other in their beliefs about capital punishment as a deterrent. But dissonance theory predicts that the readers would find a way to distort the two articles. They would find reasons to clasp the confirming article to their bosoms, hailing it as a highly competent piece of work. And they would be supercritical of the disconfirming article, finding minor flaws and magnifying them into major reasons why they need not be influenced by it. This is precisely what happened. Not only did each side discredit the other’s arguments; each side became even more committed to its own.10
The confirmation bias even sees to it that no evidence—the absence of evidence—is evidence for what we believe. When the FBI and other investigators failed to find any evidence whatsoever for the belief that the nation had been infiltrated by Satanic cults that were ritually slaughtering babies, believers in the existence of these cults were unfazed. The absence of evidence, they said, was confirmation of how clever and evil the cult leaders were: They were eating those babies, bones and all. It’s not just fringe cultists and proponents of pop psychology who fall prey to this reasoning. When Franklin D. Roosevelt made the terrible decision to uproot thousands of