unSpun_ Finding Facts in a World of Disinformation - Brooks Jackson [29]
However, by this time most of the comments on the “Town Meeting” board reflected a view of reporter Leopold and editor Ash as doggedly self-deceived, or worse. One wrote, “I’ve been on your side all along and am a great believer in TO [truthout], but this borders on lunacy.” And another said, “There’s probably a 12 step program out there for this affliction. You believe, with all your heart, soul, mind and body that a story is correct. You base your belief, not on evidence, logic or reason, but simply because you want to believe so badly, the thought of it being wrong invalidates your very existence and that makes your head hurt.” That didn’t please truthout’s proprietors, who removed the comment. But it describes the effects of cognitive dissonance pretty well.
Extreme political partisans sometimes display such irrational thinking that they have come to be dismissed as “barking moonbats.” A barking moonbat is “someone who sacrifices sanity for the sake of consistency,” according to the London blogger Adriana Cronin-Lukas of Samizdata.net, who helped popularize the colorful phrase. It is most often applied derisively to extreme partisans on the left, but we use it as originally intended, to apply to all far-out cases whose beliefs make them oblivious to facts, regardless of party or ideology.
The Psychology of Deception
In the past half century, the science of psychology has taught us a lot about how and why we get things wrong. As we’ll see, our minds betray us not only when it comes to politics, but in all sorts of matters, from how we see a sporting event, or even a war, to the way we process a sales pitch. Humans are not by nature the fact-driven, rational beings we like to think we are. We get the facts wrong more often than we think we do. And we do so in predictable ways: we engage in wishful thinking. We embrace information that supports our beliefs and reject evidence that challenges them. Our minds tend to take shortcuts, which require some effort to avoid. Only a few of us go to moonbat extremes, but more often than most of us would imagine, the human mind operates in ways that defy logic.
Psychological experiments have shown, for one thing, that humans tend to seek out even weak evidence to support their existing beliefs, and to ignore evidence that undercuts those beliefs. In the process, we apply stringent tests to evidence we don’t want to hear, while letting slide uncritically into our minds any information that suits our needs. Psychology also tells us that we rarely work through reasons and evidence in a systematic way, weighing information carefully and suspending the impulse to draw conclusions. Instead, much of the time we use mental shortcuts or rules of thumb that save us mental effort. These habits often work reasonably well, but they also can lead us to conclusions we might dismiss if we applied more thought.
Another common psychological trap is that we tend to overgeneralize from vivid, dramatic single examples. News of a terrible airline crash makes us think of commercial flying as dangerous; we forget that more than 10 million airline passenger flights land safely every year in the United States alone. Overhyped coverage of a few