Whatever might have made me pay more attention to this man, in other words, had nothing at all to do with how right he was. This is, unfortunately, a universal truth. Sometimes people succeed in showing us our errors and sometimes they fail, but only rarely does that success or failure hinge on the accuracy of their information. Instead, as we saw in our discussion of community, it has almost everything to do with the interpersonal forces at work: trust or mistrust, attraction or repulsion, identification or alienation. It’s no surprise, then, that other people’s input is often insufficient to make us recognize our mistakes.
Here, though, is something more surprising. Although I finally admitted my own error on the basis of a barbed-wire fence, we are often equally reluctant to accept the suggestion that we are wrong when it comes to us from the physical world—a far more impartial and therefore (one might imagine) far more palatable source of feedback. These red flags in our environment are, in essence, a kind of forcing function—the engineer’s term of art for features of the physical world that alert us that we are making a mistake and make it difficult to proceed until we correct it. If you’ve just emerged from the grocery store and are trying to get into a black Ford F-150 that happens to be someone else’s black Ford F-150, the key will not turn in the lock—one of a great many car-related forcing functions that have long been standard in the automotive industry.
Forcing functions are, on the whole, quite effective. But they can’t stop you from, say, jiggling your key in the lock, twisting it almost to the breaking point, taking it out, looking at it, inserting it upside down, and finally giving up and heading over to try the passenger door—at which point, you note the presence of an unfamiliar diaper bag and the absence of your coffee mug, and the light dawns. As this example suggests, environmental feedback is not all that different from human feedback: it can draw attention to our errors, but it cannot force us to acknowledge them.* The fact is, with the exception of our own minds, no power on earth has the consistent and absolute ability to convince us that we are wrong. However much we might be prompted by cues from other people or our environment, the choice to face up to error is ultimately ours alone.
Why can we do this sometimes but not others? For one thing, as we saw earlier, it’s a lot harder to let go of a belief if we don’t have a new one to replace it. For another, as Leon Festinger observed in his study of cognitive dissonance, it’s a lot harder if we are heavily invested in that belief—if, to borrow a term from economics, we have accrued significant sunk costs. Traditionally, sunk costs refer to money that is already spent and can’t be recovered. Let’s say you shelled out five grand for a used car, and three weeks later it got a flat tire. When you take it to the mechanic, he tells you that you need both rear tires replaced and the alignment adjusted. Bang: you’ve just added 250 bucks to your sticker price. A month later, the clutch gives out. You get it fixed—for a cool $900—but pretty soon you start having trouble with the ignition. Turns out you need the fuel pump repaired. There goes another $350. Now you’ve spent $1,500 to keep your $5,000 lemon running.
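(For the numerically inclined, here is a minimal sketch of that tally in Python, along with the economist’s rule discussed next. The repair figures are the ones from the passage; the future-repair and replacement figures in the usage lines are purely hypothetical, invented for illustration.)

# Tally of the lemon's running costs, using the figures from the text.
purchase_price = 5_000
repairs = {
    "rear tires and alignment": 250,
    "clutch": 900,
    "fuel pump": 350,
}
repair_total = sum(repairs.values())          # $1,500
total_spent = purchase_price + repair_total   # $6,500: the sunk cost

# The economist's rule: money already spent is gone either way, so only
# expected *future* costs enter the keep-vs-replace decision. Both
# arguments below are hypothetical placeholders, not figures from the text.
def should_replace(expected_future_repairs, replacement_cost):
    return replacement_cost < expected_future_repairs

print(repair_total, total_spent)      # 1500 6500
print(should_replace(2_000, 4_500))   # False: on these made-up numbers, keep it

Note that total_spent never appears inside should_replace: leaving the sunk cost out of the comparison is exactly what the rational actor does, and exactly what the rest of us find so hard.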
So should you ditch the car and buy another one, or should you hope for the best and stick with the one you’ve got? An economist would say that, whatever you decide, you shouldn’t factor in the $6,500 you’ve already spent. That’s your sunk cost, and since the money is gone either way, a rational actor would ignore it. But human beings are famously bad at ignoring sunk costs, because we are not really rational actors. We are quasi-rational actors, in whom reason is forever sharing the stage with ego and hope and stubbornness and loathing and loyalty. The upshot is that we are woefully bad at cutting our losses—and not just when it comes to money. We are also seduced by the sunk costs of our actions: think about those mountain climbers who keep