Being Wrong - Kathryn Schulz
In a perfect world, how would we go about evaluating all this evidence? As it turns out, we have fairly strong and uniform opinions about this. By rough consensus, the ideal thinker approaches a subject with a neutral mind, gathers as much evidence as possible, assesses it coolly, and draws conclusions accordingly. By a further rough consensus, this is not only how we should form our beliefs, but how we actually do so. To quote Rebecca Saxe, the neuroscientist from the previous chapter, “we share the conviction that, in general, beliefs follow from relatively dispassionate assessment of facts-of-the-matter and logical reasoning.”
This model of how our minds work is a significant step up from naïve realism. Instead of thinking, as toddlers do, that the world is exactly as we perceive it, we recognize that we only perceive certain bits of it—pieces of evidence—and that therefore our understanding might be incomplete or misleading. Unlike naïve realism, then, this model of cognition makes room for error. At the same time, it contains an implicit corrective: the more evidence we compile, and the more thoroughly and objectively we evaluate it, the more accurate our beliefs will be. In this vein, Descartes defined error not as believing something that isn’t true, but as believing something based on insufficient evidence.
That definition of error has, at first glance, the virtue of being practical. You can’t very well caution people against believing things that aren’t true, since, as we’ve seen, all of us necessarily think our beliefs are true. By comparison, it seems both easy and advisable to caution people against believing things without sufficient evidence. But this idea quickly runs into trouble. First, how are we supposed to know when a body of evidence crosses the threshold from “insufficient” to “sufficient”? Second, what are we supposed to do in situations where additional evidence is not necessarily forthcoming? Augustine, who arrived at Descartes’ idea of error some 1,200 years earlier, rejected it when he perceived these problems, and in particular their theological implications. If you encourage people to withhold assent from any proposition that lacks sufficient evidence, he realized, you are inevitably encouraging them to withhold assent from God.*
Augustine needn’t have worried. You can urge people not to believe anything based on meager evidence until you are blue in the face, but you will never succeed—because, as it turns out, believing things based on meager evidence is what people do. I don’t mean that we do this occasionally: only when we are thinking sloppily, say, or only if we don’t know any better, or only en route to screwing up. I mean that believing things based on paltry evidence is the engine that drives the entire miraculous machinery of human cognition.
Descartes was right to fear that this way of thinking would cause us to make mistakes; it does. Since he was interested in knowing the truth, and knowing that he knew it, he tried to develop a model of thinking that could curtail the possibility of error. (We’ll hear more about his model later.) In fact, curtailing error was