Being Wrong - Kathryn Schulz [212]
* This is true not just in politics but in any realm that requires swift, frequent, and firm decision making. Take sports: in a New York Times article on umpiring, the writer Joseph Berger observed that, “With baseballs flying at speeds faster than cars on a highway, umpires sometimes make mistakes—what referee hasn’t? But they must remain unflinching. Admit you’re wrong and chaos—or, worse, ridicule—can ensue.” Berger quotes one umpire who notes that, “A good official always comes strong with his calls. He’s always able to sell it, even if he realizes he’s made a mistake.”
† Safire would want me to point out that these terms are not interchangeable. Accusing someone of flip-flopping (changing positions on an issue) is not the same as accusing him of waffling (being indecisive) or of being wishy-washy (seeming weak). Still, these terms are often deployed together in service of a larger accusation: that their target has too many thoughts and too few convictions.
* As Festinger described it, cognitive dissonance is the uncomfortable feeling that results from simultaneously holding two contradictory ideas. This dissonance can arise from a conflict between a belief and its disconfirmation (“the spaceship will land on Tuesday”; “no spaceship landed on Tuesday”), or between a belief and a behavior (“smoking is bad for you”; “I’m on my second pack of the day”). Festinger proposed that there are two ways to ameliorate this uncomfortable feeling. The most direct way is to change your mind or your actions, but this can be difficult if you are heavily invested in the disproved belief or heavily dependent on the contraindicated behavior. The other option—more contorted, but sometimes more comfortable—is to convince yourself and others that the false belief isn’t really false, or that the harmful behavior isn’t all that harmful. This is why heightened adamancy and evangelism are not uncommon in the face of disconfirmed beliefs—as we will soon see.
* As Kuhn observed, “All historically significant theories have agreed with the facts, but only more or less. There is no more precise answer to the question whether or how well an individual theory fits the facts. But…[i]t makes a great deal of sense to ask which of two actual and competing theories fits the facts better.” Kuhn was talking about formal scientific theories, but the same generally goes for lay beliefs as well.
* At her request, I have changed her name and some of her biographical details.
† A different and somewhat less acerbic translation of this passage renders “servile conformism” as “blind belief.” In either case, al-Ghazali’s argument is essentially a restatement of the ’Cuz It’s True Constraint. Once we come to feel that we believed something for reasons other than the truth of that belief, we have all but destroyed our ability to keep believing it.
* For another example of ignoring input from our physical environment, consider my sister, who is one of those otherwise brilliant people who, for some reason, can’t find her way out of a paper bag. (I can recall her getting lost in a restaurant and a shoe store, and I strongly suspect that she could become disoriented in a mid-sized airplane.) Once, after returning from a meeting in her own building, she rounded a corner expecting to come across the door to her office, but instead found herself facing a corridor with a window at the end. That’s a pretty straightforward example of our environment giving us information that we are wrong—but, my sister said, “the first thing that came to mind wasn’t, I’m lost. It was, Who put that window there?” As always, the possibility that we ourselves have fucked up is the hypothesis of last resort.
* In a 2008 post to the blog she writes for The Atlantic, Megan McArdle,