Still, however defensible confirmation bias might be, it deals another blow to our ideal thinker. That’s the admirable soul we met earlier in this chapter, the one who gathers as much evidence as possible and assesses it neutrally before reaching a conclusion. We already saw inductive reasoning upend the first half of this ideal. We don’t gather the maximum possible evidence in order to reach a conclusion; we reach the maximum possible conclusion based on the barest minimum of evidence. Now it turns out that inductive reasoning upends the second half as well. We don’t assess evidence neutrally; we assess it in light of whatever theories we’ve already formed on the basis of whatever other, earlier evidence we have encountered.
This idea was given its most influential treatment by Thomas Kuhn, the historian and philosopher of science, in his 1962 work The Structure of Scientific Revolutions. Before Kuhn, scientists were generally regarded as the apotheosis of the above-mentioned ideal thinker. These epistemologically prudent souls were thought to opt for logic over guesswork, reject verification (looking for white swans) in favor of falsification (looking for black swans), test their hypotheses extensively, collect and analyze evidence neutrally, and only then arrive at their theories. Kuhn challenged this notion, arguing that—among other problems with this model—it is impossible to do science in the absence of a preexisting theory.
Kuhn didn’t mean this as a criticism, or at least not only as a criticism. Without some kind of belief system in place, he posited, we wouldn’t even know what kinds of questions to ask, let alone how to make sense of the answers. Far from freeing us up to regard the evidence neutrally, the absence of theories would render us unable to figure out what even counted as evidence, or what it should be counted as evidence for. Kuhn’s great insight was that preexisting theories are necessary to the kind of inquiry that is the very essence of science. And the history of the field bears this out: science is full of examples of how faith in a theory has led people to the evidence, rather than evidence leading them to the theory. In the early nineteenth century, for instance, astronomers were puzzled by perturbations in the orbit of Uranus that seemed to contradict Newtonian mechanics. Because they weren’t prepared to jettison Newton, they posited the existence of an unknown planet whose gravitational pull was affecting Uranus’s path, and calculated that planet’s necessary orbit around the sun. Guided by this work, later astronomers with better telescopes took a second look at the sky and, sure enough, discovered Neptune—less than one degree away from where the theorists had predicted it would be.*
The discovery of Neptune is a crystalline illustration of Kuhn’s point that theories represent the beginning of science as much as its endpoint. Theories tell us what questions to ask (“Why is Uranus’s orbit out of whack?”), what kind of answers to look for (“Something really big must be exerting a gravitational pull on it.”), and where to find them (“According to Newtonian calculations, that big thing must be over there.”). They also tell us what not to look for and what questions not to ask, which is why those astronomers didn’t bother looking for a giant intergalactic battleship warping Uranus’s orbit instead. These are invaluable directives, prerequisite to doing science—or, for that matter, to doing much of anything. As Alan Greenspan pointed out, during a moment in his congressional testimony when he seemed to be coming under fire for merely possessing a political ideology, “An ideology is a conceptual framework, the way people deal with reality. Everyone has one. You have to. To exist, you need an ideology.”
Greenspan was right. To exist, to deal with reality, we need a conceptual framework: theories that tell us which questions to ask and which ones not to, where to look and where not to look.