Design of Everyday Things

Normal accidents.

13 Fischhoff’s (1975) study is called “Hindsight ≠ foresight: The effect of outcome knowledge on judgment under uncertainty.” And while you are at it, see the very impressive book of readings entitled Acceptable risk (Fischhoff, Lichtenstein, Slovic, Derby, & Keeney, 1981).

14 Korean Air flight 007 has been analyzed by Hersh (1986), who gives a plausible, detailed account of what might have gone wrong with the flight. Because the aircraft flight recorders were not recovered, we will never know with certainty what did happen. The actions on the Soviet side appear to have been equally confused, with pilots and the military under various social pressures to act. The information available about the Soviets’ actions is insufficient to reach any reliable conclusions.

15 My source for information about the Tenerife crash is Roitsch, Babcock, & Edmunds (undated), the report issued by the American Airline Pilots Association. It is perhaps not too surprising that it differs in interpretation from the Spanish government’s report (Spanish Ministry of Transport and Communications, 1978), which in turn differs from the report by the Dutch Aircraft Accident Inquiry Board (1979). See also how Weiner treats the crash and its aftermath (Weiner, 1980; reprinted in Hurst & Hurst, 1982). (Weiner calls the episode the result of the Realpolitik of a system that “emphasizes airspace allocation and political compromise, rather than dealing directly with the variety of problems facing pilots and controllers.”)

The information and quotations about the Air Florida crash are from the report of the National Transportation Safety Board (1982). An excellent review of the social pressures can be found in Weiner (1986) and in two books entitled Pilot error (Hurst, 1976; Hurst & Hurst, 1982). (The two books are quite different. The second is better than the first, in part because at the time the first book was written, not much scientific evidence was available.)

16 Warning signals can be designed properly. Roy Patterson at the Medical Research Council’s Applied Psychology Unit in Cambridge, England, has devised a systematic set of procedures for conveying the meaning and importance of a problem with a carefully controlled sequence of sounds, where the frequency, intensity, and rate of presentation identify the problem and indicate its seriousness. The scheme can be applied wherever a number of devices all require warning sounds, such as in aircraft cockpits or hospital operating rooms. It has been proposed as an international standard for warnings, and it is now working its way slowly through the societies and committees that must approve such things.

One problem has always been knowing how loud to make the signal. The common solution is to make it very loud. Patterson points out that the sound level required depends on what else is happening. When an airplane is taking off, loud warnings are needed. When it is in level, continuous flight, low levels will do. Patterson’s system has variable loudness: the warning sound starts off softly, then repeats with increasing intensity until the signal is acknowledged.
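The escalating behavior Patterson describes lends itself to a small illustration. The sketch below is not his specification; it simply models a warning that begins softly and repeats a little louder each time until it is acknowledged or a repeat limit is reached. The play_tone and is_acknowledged functions, and all of the numbers, are placeholders invented for the sketch.

    import time

    def play_tone(frequency_hz, loudness_db, duration_s):
        # Placeholder: a real system would drive an audio device here.
        print(f"tone: {frequency_hz:.0f} Hz at {loudness_db:.0f} dB for {duration_s:.1f} s")

    def is_acknowledged():
        # Placeholder: a real system would poll a button or touchscreen.
        return False

    def escalating_warning(frequency_hz=800.0, start_db=55.0, step_db=5.0,
                           max_db=85.0, interval_s=1.0, max_repeats=6):
        # Repeat the tone, growing louder each time, until the warning is
        # acknowledged or the repeat limit is reached.
        loudness = start_db
        for _ in range(max_repeats):
            if is_acknowledged():
                return
            play_tone(frequency_hz, loudness, duration_s=0.5)
            loudness = min(loudness + step_db, max_db)
            time.sleep(interval_s)

    escalating_warning()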

Modern technology makes it possible to have machines talk, either by storing a compressed waveform or by synthesizing a voice. This approach, like all approaches, has its strengths and weaknesses. It allows precise information to be conveyed, especially when the person’s visual attention is being directed elsewhere. But if several speech warnings operate at the same time, or if the environment is noisy, speech warnings cannot be understood. And if conversations among the users or operators are necessary, speech warnings will interfere. Speech warning signals can be effective, but only if used intelligently.
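As a rough illustration of what “used intelligently” might mean for overlapping messages, the sketch below queues spoken warnings and delivers them one at a time, most urgent first, so that simultaneous messages do not talk over one another. The SpeechWarningQueue class, the speak placeholder, and the example messages are all invented for the sketch; they do not come from the text.

    import heapq

    def speak(message):
        # Placeholder: a real system would hand the text to a speech
        # synthesizer or play a stored waveform.
        print(f"speaking: {message}")

    class SpeechWarningQueue:
        # Hold pending spoken warnings and deliver them one at a time,
        # in priority order (lower number = more urgent).
        def __init__(self):
            self._heap = []
            self._sequence = 0  # tie-breaker keeps equal priorities in arrival order

        def post(self, message, priority):
            heapq.heappush(self._heap, (priority, self._sequence, message))
            self._sequence += 1

        def speak_next(self):
            if not self._heap:
                return False
            _, _, message = heapq.heappop(self._heap)
            speak(message)
            return True

    queue = SpeechWarningQueue()
    queue.post("Fuel low", priority=2)
    queue.post("Cabin altitude", priority=1)
    while queue.speak_next():
        pass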

17 I discuss the idea of designing for error in a paper in the Communications of the ACM, in which I analyze a number of the slips people make in using computer systems and suggest system design principles that might minimize those errors (Norman, 1983). This philosophy also
