Being Wrong - Kathryn Schulz [89]
Still, something tells me that even if we ameliorated this problem—say, by abolishing the electoral college, which would significantly diminish the influence of undecided voters—we would still react to such people with outrage and scorn. After all, if the only thing we cared about was the outcome of the election, we should get far more worked up about the millions of voters who flatly disagree with us than about the slim percentage that isn’t sure. Instead, when push comes to shove, we generally have more fellow-feeling for our political opponents. Those people might want the plate of shit, but at least they agree with us on this much: some things are so important that everyone should take a definite stand on them.
This is why undecided voters drive us crazy. They think hard about something that most of us don’t have to think about at all. Confronted by a choice that we find patently obvious, they are unsure what to believe, and so they hesitate, vacillate, wait for more information. In other contexts, such actions seem reasonable, even laudable. In fact, they comport pretty closely with the ideal thinker I introduced back in our discussion of evidence. This isn’t to say that the average undecided voter represents some kind of optimal philosopher-citizen whom we should all seek to emulate. (For starters, as we saw earlier, that ideal thinker isn’t so ideal in the first place.) What these voters do represent, however, are possibilities the rest of us often foreclose: the ability to experience uncertainty about even hugely important beliefs; the ability to wonder, right up until the moment that the die is cast, if we might be wrong.
If the undecided voter has a strong suit, that is it: she knows that she could be wrong. If the rest of us have a strong suit, it is that we care, passionately, about our beliefs. As conflicting as these two strengths might initially seem, they can, in theory, be reconciled. The psychologist Rollo May once wrote about the “seeming contradiction that we must be fully committed, but we must also be aware at the same time that we might possibly be wrong.” Note that this is not an argument for centrism, or for abandoning the courage of our convictions. May’s point was precisely that we can retain our convictions—and our conviction—while jettisoning the barricade of certainty that surrounds them. Our commitment to an idea, he concluded, “is healthiest when it is not without doubt, but in spite of doubt.”
Most of us do not want to be doctrinaire. Most of us do not want to be zealots. And yet it is bitterly hard to put May’s maxim into practice. Even with the best of intentions, we are often unable to relinquish certainty about our beliefs. One obstacle to doing so is the feeling of being right, shored up as it is by everything from our sensory impressions to our social relations to the structure of human cognition. But a second and paradoxical obstacle is our fear of being wrong. True, certainty cannot protect us from error, any more than shouting a belief can make it true. But it can and does shield us, at least temporarily, from facing our fallibility.
The psychologist Leon Festinger documented this protective effect of certainty in the 1950s, in the study that gave us the now-famous term “cognitive dissonance.” Along with several colleagues and hired observers, Festinger infiltrated a group of people who believed in the doomsday prophecies of a suburban housewife named (actually, pseudonymed) Marian Keech. Keech claimed that she was in touch with a Jesuslike figure from outer space who sent her messages about alien visits, spaceship landings, and the impending destruction of the world.