unSpun: Finding Facts in a World of Disinformation - Brooks Jackson
Psychologists have also found that when we feel most strongly that we are right, we may in fact be wrong. And they have found that people making an argument or a supposedly factual claim can manipulate us by the words they choose and the way they present their case. We can’t avoid letting language do our thinking for us, but we can become more aware of how and when language is steering us toward a conclusion that, upon reflection, we might choose to reject. In this chapter we’ll describe ways in which we get spun, and how to avoid the psychological pitfalls that lead us to ignore facts or believe bad information.
And this has nothing to do with intelligence. Presidents, poets, and even professors and journalists fall into these traps. You do it. We all do it. But we are less likely to do it if we learn to recognize how our minds can trick us, and learn to step around the mental snares nature has set for us. The late Amos Tversky, a psychologist who pioneered the study of decision errors in the 1970s, frequently said, “Whenever there is a simple error that most laymen fall for, there is always a slightly more sophisticated version of the same problem that experts fall for.”1
Experts like the doctors, lawyers, and college professors who made up a group convened in 1988 by Kathleen Jamieson to study the impact of political ads. They all thought other people were being fooled by advertising, but that they themselves were not. When asked, “What effect, if any, do you think the presidential ads are having on voters in general and us in particular?” every person in the group said that voters were being manipulated, yet each one insisted that he or she was invulnerable to persuasion by ads. Then the discussion moved on to the question “What ads have you seen in the past week?” One participant recounted that he had seen an ad on Democratic presidential candidate Michael Dukakis’s “revolving-door prison policy.” The group then spent time talking about Dukakis’s policies on crime control, and in that discussion each adopted the language used in the Republican ad. The ad’s language had begun to do their thinking about Dukakis for them. Without being aware of it, they had absorbed and embraced the metaphor that Dukakis had installed a “revolving door” on state prisons. The notion that others will be affected by exposure to messages while we are immune is called the third-person effect.
The other side of the third-person effect is wishful thinking. Why do most people think they are more likely than they actually are to live past eighty? Why do most believe that they are better than average drivers? At times we all seem to live in Garrison Keillor’s fictional Lake Wobegon, where “all the children are above average.” To put it differently: in some matters we are unrealistic about how unrealistic we actually are.
The “Pictures in Our Heads” Trap
Misinformation is easy to accept if it reinforces what we already believe. The journalist Walter Lippmann observed in 1922 that we are all captives of the “pictures in our heads,” which don’t always conform to the real world. “The way in which the world is imagined determines at any particular moment what men will do,” Lippmann wrote in his book Public Opinion. Deceivers are aware of the human tendency to think in terms of stereotypes, and they exploit it.
A good example of this is the way the Bush campaign in 2004 got away with falsely accusing John Kerry of repeatedly trying to raise gasoline taxes and of favoring a 50-cent increase at a time when fuel prices were soaring. A typical ad said, “He supported higher gasoline taxes eleven times.” In fact, Kerry had voted for a single 4.3-cent increase, in 1993. The following year, he flirted briefly in a newspaper interview with the idea of a 50-cent-a-gallon gasoline