Being Wrong - Kathryn Schulz [83]
Wittgenstein, then, defended certainty on the grounds that it is sometimes logically necessary—that without being sure of some things, we can’t even begin to think about everything else. (This is an echo, in a deeper register, of Kuhn’s point that we can’t make sense of the world without theories.) James, meanwhile, defended certainty on the grounds that it is sometimes an aid to action, necessary to our survival and success. Each of these defenses points to a third, which is that certainty is evolutionarily advantageous. As I said earlier, taking the time to interrogate a belief requires more cognitive resources—and, potentially, poses a greater risk—than simply accepting it. For this reason, William Hirstein (the author of Brain Fiction) calls doubt “a cognitive luxury,” one that “occurs only in highly developed nervous systems.”
Hirstein has a point; you will be hard-pressed to find a skeptical mollusk. And what goes for our collective evolutionary past also goes for our individual developmental trajectory—which is why you will also be hard-pressed to find a skeptical one-year-old. “The child learns by believing the adult,” Wittgenstein observed. “Doubt comes after belief.” It also comes in different forms and stages. It’s one thing to doubt the existence of Santa Claus, another thing to doubt the accuracy of a news story, and a third thing to doubt the accuracy of a news story you yourself wrote. How adept we are at these different degrees of doubt depends on a variety of factors, including how emotionally capable we are of tolerating uncertainty (more on that in a moment) and how much we have been exposed to and explicitly trained in skeptical inquiry. Doubt, it seems, is a skill—and one that, as we saw earlier, needs to be learned and honed. Credulity, by contrast, appears to be something very like an instinct.
So doubt post-dates belief, both in the long haul of evolution and in the shorter haul of our own emotional and intellectual development. And we can shrink the time frame even further: doubt also seems to come after belief in many individual instances in which we process information about the world. That, at any rate, was the finding of the psychologist Daniel Gilbert and his colleagues, in a 1990 study designed to test an assertion by the Dutch philosopher Baruch Spinoza. Spinoza claimed that when we encounter a new piece of information, we automatically accept it as true, and only reject it as false (if we do so at all) through a separate and subsequent process. This claim ran counter to a more intuitive and—at least according to Descartes—more optimal model of cognition, in which we first weigh the likelihood that a new piece of information is true, and then accept or reject it accordingly. To borrow Gilbert’s example (because who wouldn’t?), consider the following sentence: “Armadillos may be lured from a thicket with soft cheese.” If Spinoza is right, then merely by reading this sentence, you are also, however fleetingly, believing it. In this model, belief is our default cognitive setting, while doubt or disbelief requires a second, superadded act.
All of us have experienced something like what Spinoza was getting at. As Gilbert and his colleagues point out, if I’m driving along and I suddenly see a dachshund in the middle of the road, I will swerve my car long before I can decide whether the proposition at hand (“there is a dachshund in the middle of the road”) is true or false. One could take matters a step further and suppose that I would also swerve my car if I saw a unicorn in the middle of the road—even though, if I took the time to contemplate the situation, I would surely conclude that unicorns do not exist, in the middle of the road or anywhere else. In fact, most of us really have swerved in response to imaginary entities. Not long ago I was walking under a scaffolding in Manhattan, when in a flash I found myself jumping aside and covering my head with my arms. Some fluctuation in the light or a trick of my peripheral vision or—who knows?—a random misfiring of my synapses had created the