The Believing Brain - Michael Shermer [145]
Program C: Four hundred people will die.
Program D: There is a one-third probability that nobody will die, and a two-thirds probability that all six hundred people will die.
Even though the net result of the second set of choices is precisely the same as the first, subjects switched preferences, from 72 percent for Program A to 78 percent for Program D. The framing of the question led to the shift in preference. We prefer to think in terms of how many people we may save instead of how many people will die—the “positive frame” is preferred over the “negative frame.”23
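The claim that the two framings are numerically identical can be verified with a quick expected-value calculation. The sketch below is mine, not part of the original study; it simply checks that Program D's gamble yields the same expected number of deaths as Program C's certainty:

```python
# Six hundred people are at risk in the classic framing scenario.
total_at_risk = 600

# Program C: four hundred people will die, with certainty.
expected_deaths_c = 400

# Program D: one-third probability nobody dies,
# two-thirds probability all six hundred die.
expected_deaths_d = (1 / 3) * 0 + (2 / 3) * total_at_risk

# Both programs have the same expected outcome: 400 deaths.
print(expected_deaths_c, expected_deaths_d)
```

The two values come out equal, which is exactly why the preference reversal counts as a framing effect rather than a rational change of mind.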
Anchoring Bias
Lacking an objective standard by which to evaluate beliefs and decisions—and such a standard is usually not available—we grasp for any standard on hand, no matter how subjective. Such standards are called anchors, and reliance on them creates the anchoring effect: the tendency to lean too heavily on a past reference or on a single piece of information when making decisions. The comparison anchor can even be entirely arbitrary. In one study subjects were asked to give the last four digits of their Social Security numbers and then to estimate the number of physicians in New York City. Bizarrely, people with higher Social Security numbers tended to give higher estimates for the number of docs in Manhattan. In a related study, subjects were shown an array of items to purchase—a bottle of wine, a cordless computer keyboard, a video game—and were then told that the value of the items was equal to the last two digits of their Social Security numbers. When subsequently asked the maximum price they would be willing to pay, subjects with high Social Security numbers consistently said that they would be willing to pay more than those with low numbers. With no objective anchor for comparison, this random anchor influenced their judgments arbitrarily.
Our intuitive sense of the anchoring effect and its power leads negotiators in corporate mergers, representatives in business deals, and even disputants in divorces to begin from an extreme initial position in order to set the anchor high for their side.
Availability Heuristic
Have you ever noticed how many red lights you encounter while driving when you are late for an appointment? Me, too. How does the universe know that I left late? It doesn’t, of course, but the fact that most of us notice more red lights when we are running late is an example of the availability heuristic, or the tendency to assign probabilities of potential outcomes based on examples that are immediately available to us, especially those that are vivid, unusual, or emotionally charged, which are then generalized into conclusions upon which choices are based.24
For example, your estimation of the probability of dying in a plane crash (or lightning strike, shark attack, terrorist attack, and so on) will be directly related to the availability of just such an event in your world, especially your exposure to it in mass media. If newspapers and especially television cover an event there is a good chance that people will overestimate the probability of that event happening.25 An Emory University study, for example, revealed that the leading cause of death in men—heart disease—received the same amount of media coverage as the eleventh-ranked cause: homicide. In addition, drug use—the lowest-ranking risk factor associated with serious illness and death—received as much attention as the second-ranked risk factor of poor diet and lack of exercise. Other studies have found that women in their forties believe they have a 1 in 10 chance of dying from breast cancer, while their real lifetime odds are more like 1 in 250. This effect is directly related to the number of news stories about breast cancer.26
Representative Bias
Related to the availability bias is the representative bias, which, as described by its discoverers, psychologists Amos Tversky and Daniel Kahneman, means: “an event is judged probable to the extent that it represents the essential features of its parent population or generating process.”