All of these error-prevention techniques—from Six Sigma to the innovations of the airline industry to the efforts at Beth Israel—have three key elements in common. The first, as I’ve indicated, is acceptance of the likelihood of error. That’s why officials at BIDMC set about trying to determine “all the different ways patients get hurt.” And it’s why Six Sigma analysts systematically imagine the failure of every component of a product or process, the likely implications of that failure, and the best ways to stave it off (a technique borrowed from an early quality-control measure known as Failure Mode and Effects Analysis). In fact, even as Six Sigma aims for near perfection, it also strives to build into companies and processes a “tolerance for failure.” That is, it seeks to foster both awareness of the possibility of screw-ups and strong risk-management strategies, so that any error that does occur will be a “safe failure.”
The second element these error-prevention strategies have in common is openness. Perhaps the most dramatic difference between Paul Levy’s handling of his hospital’s wrong-side surgery and more conventional reactions to medical error was its extreme transparency. The hospital went as far as it could toward widespread and detailed acknowledgment of the error without compromising the privacy of the patient. As that suggests, this openness wasn’t done on behalf of the patient, as a kind of bigger, showier apology. (The hospital was open with the patient, and did apologize, but such apologies should transpire only among the medical team, the appropriate hospital administrators, the patient, and the patient’s family.) It’s more accurate to say that it was done on behalf of future patients. The point of the very public admission of error was to ensure that everyone in the hospital learned as much as possible from it, to remind staff members to follow the error-prevention protocols already in place, and to generate as many new ideas as possible for preventing future problems. A similar recognition of the importance of openness spurred the airline industry to create a culture in which flight crews and ground staff are encouraged (and in some cases even required) to report mistakes and are protected from punishment and litigation if they do so. Likewise, GE, one of the early Six Sigma adopters, claims that to eliminate error, it has “opened our culture to ideas from everyone, everywhere, decimated the bureaucracy and made boundaryless behavior a reflexive, natural part of our culture.”†
The final element that all error-deterrent systems have in common is a reliance on verifiable data—what Six Sigma analysts call “management by fact” rather than by “opinions and assumptions.” One of the great mysteries of what went awry in the wrong-side surgery at BIDMC was that, as Kenneth Sands put it, “for whatever reason, [the surgeon] simply felt that he was on the correct side” of the patient. “Whatever reason” and “simply feeling” are precisely the kinds of cues that error-proofing processes seek to override. That’s why these processes place so much emphasis on verifying even small, seemingly straightforward aspects of the procedure in question. We know from our own experience that one way we err is through the failure of tasks so obvious or automatic that we seldom bother to double-check them: think about the last time you locked your keys in your car because you assumed you’d tossed them into your bag as usual.
Relying on hard data, committing to open and democratic communication, acknowledging fallibility: these are the central tenets of any system that aims to protect us from error. They are also markedly different from how we normally think—from our often hasty and asymmetric treatment of evidence, from the cloistering effects of insular communities, and from our instinctive recourse to defensiveness and denial. In fact, the whole reason these