I was in a taxi in Austin, Texas, admiring the large number of new devices in front of the driver. No more simple radio. In its place was a computer display, so that messages from the dispatcher were now printed on the screen. The driver took great delight in demonstrating all the features to me. On the radio transmitter I saw four identical-looking buttons laid out in a row.
“Oh,” I said, “you have four different radio channels.”
“Nope,” he replied, “three. The fourth button resets all the settings. Then it takes me thirty minutes to get everything all set up properly again.”
“Hmm,” I said, “I bet you hit that every now and then by accident.”
“I certainly do,” he replied (in his own unprintable words).
In computer systems, it is common to prevent errors by requiring confirmation before a command will be executed, especially when the action will destroy a file. But the request is ill-timed; it comes just after the person has initiated the action and is still fully content with the choice. The standard interaction goes something like this:
USER: Remove file “My-most-important-work.”
COMPUTER: Are you certain you wish to remove the file “My-most-important-work”?
USER: Yes.
COMPUTER: Are you certain?
USER: Yes, of course.
COMPUTER: The file “My-most-important-work” has been removed.
USER: Oops, damn.
The user has requested deletion of the wrong file, but the computer’s request for confirmation is unlikely to catch the error; the user is confirming the action, not the file name. Thus asking for confirmation cannot catch all slips. It would be more appropriate to eliminate irreversible actions: in this example, the request to remove a file would be handled by the computer’s moving the file to some temporary holding place. Then the user would have time for reconsideration and recovery.
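To make the idea concrete, here is a minimal sketch, in Python, of how a delete command could be made reversible by moving the file into a holding directory rather than erasing it. The holding location and the names soft_delete and restore are illustrative assumptions, not the behavior of any particular operating system.

import shutil
import time
from pathlib import Path

TRASH_DIR = Path.home() / ".trash"  # hypothetical holding place for "deleted" files

def soft_delete(path: str) -> Path:
    """Move the file into the holding directory instead of erasing it."""
    src = Path(path)
    TRASH_DIR.mkdir(parents=True, exist_ok=True)
    # Add a timestamp so two discarded files with the same name cannot collide.
    dest = TRASH_DIR / f"{src.name}.{int(time.time())}"
    shutil.move(str(src), str(dest))
    return dest  # remember where it went, so the action can be undone

def restore(trashed: Path, original: str) -> None:
    """Undo the deletion by moving the file back where it came from."""
    shutil.move(str(trashed), original)

# The ill-timed confirmation disappears: a user who notices the slip
# simply asks for the file back.
# trashed = soft_delete("My-most-important-work")
# restore(trashed, "My-most-important-work")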
At a research laboratory I once directed, we discovered that people would frequently throw away their records and notes, only to discover the next day that they needed them again. We solved the problem by getting seven trash cans and labeling them with the days of the week.
Then the trash can labeled Wednesday would be used only on Wednesdays. At the end of the day it was safely stored away and not emptied until the next Tuesday, just before it was to be used again.
People discovered that they kept neater records and books because they no longer hesitated to throw away things they thought would probably never be used again; it was safe to discard something, for they still had a week in which to change their minds.
But design is often a tradeoff. We had to make room for the six reserve wastebaskets, and we had a never-ending struggle with the janitorial staff, who kept trying to empty all of the wastebaskets every evening. The users of the computer center came to depend upon the “soft” nature of the wastebaskets and would discard things that they otherwise might have kept for a while longer. When there was an error, whether by the janitorial staff or by us in failing to cycle the wastebaskets properly, it was a calamity. When you build an error-tolerant mechanism, people come to rely upon it, so it had better be reliable.
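The seven-wastebasket scheme translates directly into software: empty the holding place only after its contents have aged for a week, so there is always time to change one’s mind. The sketch below assumes the hypothetical holding directory from the earlier example; the seven-day window and the name purge_old are arbitrary choices for illustration.

import time
from pathlib import Path

TRASH_DIR = Path.home() / ".trash"   # the same hypothetical holding place
RETENTION = 7 * 24 * 60 * 60         # one week, in seconds

def purge_old() -> None:
    """Erase only items that have sat in the holding place longer than a week."""
    now = time.time()
    for item in TRASH_DIR.glob("*"):
        if item.is_file() and now - item.stat().st_mtime > RETENTION:
            item.unlink()  # only at this point does the deletion become irreversible

Run on a schedule, such a purge plays the role of the janitorial staff, and the same caution applies: once people come to depend on the grace period, the purge routine must honor it reliably.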
Mistakes as Errors of Thought
Mistakes result from the choice of inappropriate goals. A person makes a poor decision, misclassifies a situation, or fails to take all the relevant factors into account. Many mistakes arise from the vagaries of human thought, often because people