Design of Everyday Things [71]
Mistakes, especially when they involve misinterpreting the situation, can take a very long time to be discovered. For one thing, the interpretation is quite reasonable at the time. This is a special problem in a novel situation. The situation may look very much like others we’ve been in; we tend to confuse the rare event with the frequent one.
How many times have you heard a strange noise while driving your car, only to dismiss it as irrelevant or unimportant? How many times has your dog barked in the night, causing you to get up and yell, “Be quiet!”? But what if the car really is broken, and your dismissal has increased the damage? Or there really is a burglar outside, and you’ve silenced the dog?
This problem is natural. There are lots of things we could pay attention to or worry about; most would be false alarms, irrelevant minor events. At the other extreme, we can ignore everything, rationally explaining away each apparent anomaly. Hear a noise that sounds like a pistol shot and explain it away: “Must be a car’s exhaust backfiring.” Hear someone yell out and think, Why can’t my neighbors be quiet? Most of the time we are correct. But when we’re not, our explanations seem stupid and hard to justify.
When there is a devastating accident, the fact that people explained away the signs of the impending disaster always seems implausible to others. Afterward, there is a tendency to read about what has taken place and to criticize: “How could those people be so stupid? Fire them. Pass a law against it. Redo the training.” Look at the nuclear power accidents. Operators at Three Mile Island made numerous errors and misdiagnoses, but each one was logical and understandable at the time. The nuclear plant disaster at Chernobyl in the Soviet Union was triggered by a well-intentioned attempt to test the safety features of the plant. The actions seemed logical and sensible to the operators at the time, but now their judgments can be seen to have been erroneous.12
Explaining away errors is a common problem in commercial accidents. Most major accidents follow a series of breakdowns and errors, problem after problem, each making the next more likely. Seldom does a major accident occur without numerous failures: equipment malfunctions, unusual events, a series of apparently unrelated breakdowns and errors that culminate in a major disaster; yet no single step appeared serious at the time. In many of these cases, the people involved noted the problem but explained it away, finding a logical explanation for the otherwise deviant observation.
The contrast in our understanding before and after an event can be dramatic. The psychologist Baruch Fischhoff has studied explanations given in hindsight, where events seem completely obvious and predictable after the fact but completely unpredictable beforehand.13
Fischhoff presented people with a number of situations and asked them to predict what would happen: they were correct only at the chance level. He then presented the same situations, along with the actual outcomes, to another group of people, asking them to state how likely each outcome was: when the actual outcome was known, it appeared plausible and likely, whereas the alternatives appeared unlikely. When the actual outcome was not known, no single alternative stood out as much more plausible than the rest. It is a lot easier to determine what is obvious after it has happened.
SOCIAL PRESSURE AND MISTAKES
A subtle issue that seems to figure in many accidents is social pressure. Although it may not at first seem to be relevant in design, it has strong influence on everyday behavior. In industrial settings social pressures can lead to misinterpretation, mistakes, and accidents. For understanding mistakes, social structure is every bit as essential as physical structure.
Look at airline accidents, not everyday activities for most of us, but subject to the same principles. In 1983, Korean Air flight 007 strayed