Contemplated risks are those that have not occurred but are recognized. For an individual who has never had a heart attack, the possibility of one may fall into this category. For the United States, an avian flu pandemic is a contemplated risk: A massive outbreak has not yet happened, but newspapers have discussed the possibility, and public and private officials are preparing for it.
Experienced risks, our third category, are risks we both think about and have experienced before. The flu, a fender bender, or a computer crash is an experienced risk for most individuals.
Neglected risks, the last category, have occurred but are not currently contemplated. For example, a century ago an asteroid exploded over Tunguska, Siberia, with 1,000 times the power of the atomic bomb on Hiroshima; it toppled 80 million trees. But the risk of an asteroid explosion has drifted out of public consciousness into the category of neglected risk (leaving aside, of course, NASA’s Asteroid and Comet Impact Hazards Group).
This chapter focuses on learning from the occurrence of an extreme event. For many risks, the probability of an occurrence is unknown and may be changing over time. In these cases, even experts must rely on conjectural or subjective assessments of the risk, as we see with climate change. When new information comes to light, such as the occurrence of an extreme event, individuals, institutions, and society must update their assessment of the risk. New information from other sources, such as scientific studies, will also lead to learning. For example, we learned about climate risks from a 2-mile-long ice core from Antarctica that extended the climate record back an additional 210,000 years.
In theory, a rational observer would update her assessment of the risk upon receiving new information. The theoretical model for rational updating, a mathematical formula called Bayes’ Rule, is discussed in the next section. Unfortunately, most people are not equipped—mentally or mathematically—to be so ideally rational. Therefore, individuals and society will alter their expectations about extreme events in a nonscientific and often biased fashion. This chapter explores a two-part conjecture: (1) After the occurrence of a virgin risk, people will overestimate the probability of another occurrence in the near future; (2) by contrast, after an experienced risk occurs, people will under-update their assessment of another event occurring soon.
THE INABILITY TO USE BAYESIAN UPDATING IN EVERYDAY PRACTICE
Risks are often posited to have an unknown true probability. The textbook model for how to proceed employs Bayes’ Rule (named for the eighteenth-century British mathematician Thomas Bayes), which shows mathematically how people should rationally revise their existing beliefs in light of new evidence. Individuals use information available beforehand to form a so-called prior belief about the probability that an event will occur in a given period. New evidence about the risk is captured in a likelihood function, which expresses how plausible the evidence is given each possible value of the probability. The prior belief and the likelihood function are then combined to produce what is called a posterior distribution, which is simply the updated version of the prior belief. This requires specifying a probability model with all its important parameters and assessing precisely how those parameters were distributed before the event occurred.
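In its simplest form, Bayes’ Rule states that

\[
P(H \mid E) \;=\; \frac{P(E \mid H)\,P(H)}{P(E)},
\]

where \(H\) is a hypothesis about the risk (say, a particular value of the event probability), \(E\) is the new evidence, \(P(H)\) is the prior belief, \(P(E \mid H)\) is the likelihood, and \(P(H \mid E)\) is the posterior belief. To make this concrete, consider a stylized example of our own (the numbers are purely illustrative): a decision maker believes the annual probability of a certain disaster is either 1 percent or 10 percent, assigning prior weights of 0.9 and 0.1 to these two hypotheses. If the disaster then strikes, Bayes’ Rule gives a posterior weight on the high-risk hypothesis of (0.1 × 0.10) / (0.9 × 0.01 + 0.1 × 0.10) ≈ 0.53, more than a fivefold jump from the prior weight of 0.1.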
While this process might work well in theory, it is rarely applied in full in everyday practice, even by those with training in decision analysis. We mention a few of the most prominent explanations among many. Individuals and policy makers are unlikely to have prior beliefs in their heads, especially for virgin risks, and not thinking about every possible risk before it occurs is a perfectly sensible way to organize one’s life. However, people often neglect