Being Wrong - Kathryn Schulz

so pejoratively. If I suggest that the CEO of Green Tea Incorporated stands to gain financially by believing in the benefits of green tea, I’m at the very least implying that she isn’t qualified to judge the truth of her belief—and, more likely, I am implying that there is no truth to her belief.* In other words, if we want to discredit a belief, we will argue that it is advantageous, whereas if we want to champion it, we will argue that it is true. That’s why we downplay or dismiss the self-serving aspects of our own convictions, even as we are quick to detect them in other people’s beliefs.

Psychologists refer to this asymmetry as “the bias blind spot.” The bias blind spot can be partly explained by the Lake Wobegon Effect, that endlessly entertaining statistical debacle whereby we all think that we are above average in every respect—including, amusingly, impartiality. But a second factor is that we can look into our own minds, yet not into anyone else’s. This produces a methodological asymmetry: we draw conclusions about other people’s biases based on external appearances—on whether their beliefs seem to serve their interests—whereas we draw conclusions about our own biases based on introspection. Since, as we’ve seen, much of belief-formation neither takes place in nor leaves traces in conscious thought, our conclusions about our own biases are almost always exculpatory. At most, we might acknowledge the existence of factors that could have prejudiced us, while determining that, in the end, they did not. Unsurprisingly, this method of assessing bias is singularly unconvincing to anyone but ourselves. As the Princeton psychologist Emily Pronin and her colleagues observed in a study of the bias blind spot, “we are not particularly comforted when others assure us that they have looked into their own hearts and minds and concluded that they have been fair and objective.”

So we look into our hearts and see objectivity; we look into our minds and see rationality; we look at our beliefs and see reality. This is the essence of the ’Cuz It’s True Constraint: every one of us confuses our models of the world with the world itself—not occasionally or accidentally but necessarily. This is a powerful phenomenon, and it sets in motion a cascade of corollaries that determines how we deal with challenges to our belief systems—not, alas, for the better.

The first such corollary is the Ignorance Assumption. Since we think our own beliefs are based on the facts, we conclude that people who disagree with us just haven’t been exposed to the right information, and that such exposure would inevitably bring them over to our team. This assumption is extraordinarily widespread. To cite only the most obvious examples, all religious evangelism and a good deal of political activism (especially grass-roots activism) is premised on the conviction that you can change people’s beliefs by educating them on the issues.

The Ignorance Assumption isn’t always wrong; sometimes our ideological adversaries don’t know the facts. But it isn’t always right, either. For starters, ignorance isn’t necessarily a vacuum waiting to be filled; just as often, it is a wall, actively maintained. More to the point, though, the Ignorance Assumption can be wrong because we can be wrong: the facts might contradict our own beliefs, not those of our adversaries. Alternatively, the facts might be sufficiently ambiguous to support multiple interpretations. That we generally ignore this possibility speaks to the powerful asymmetry of the Ignorance Assumption. When other people reject our beliefs, we think they lack good information. When we reject their beliefs, we think we possess good judgment.

When the Ignorance Assumption fails us—when people stubbornly persist in disagreeing with us even after we’ve tried to enlighten them—we move on to the Idiocy Assumption. Here, we concede that our opponents know the facts, but deny that they have the brains to comprehend them. This assumption can be a narrow judgment, applied to a specific person on a specific issue, or it can be a
