Mistakes Were Made - Carol Tavris [75]
Yet training that promotes the certainties of pseudoscience, rather than a humbling appreciation of our cognitive biases and blind spots, increases the chances of wrongful convictions in two ways. First, it encourages law-enforcement officials to jump to conclusions too quickly. A police officer decides that a suspect is the guilty party, and then closes the door to other possibilities. A district attorney decides impulsively to prosecute a case, especially a sensational one, without having all the evidence; she announces her decision to the media; and then finds it difficult to back down when subsequent evidence proves shaky. Second, once a case is prosecuted and a conviction won, officials will be motivated to reject any subsequent evidence of the defendant’s innocence.
The antidote to these all-too-human mistakes is to ensure that in police academies and law schools, students learn about their own vulnerability to self-justification. They must learn to look for the statistically likely suspect (a jealous boyfriend) without closing their minds to the statistically less likely suspect, if that is where some evidence leads. They need to learn that even if they are confident that they can tell if a suspect is lying, they could be wrong. They need to learn how and why innocent people can be induced to confess to a crime they did not commit, and how to distinguish confessions that are likely to be true from those that have been coerced.44 They need to learn that the popular method of profiling, that beloved staple of the FBI and TV shows, carries significant risks of error because of the confirmation bias: When investigators start looking for elements of a crime that match a suspect’s profile, they also start overlooking elements that do not match. In short, investigators need to learn to change trees once they realize they are barking up the wrong one.
Law professor Andrew McClurg would go further in the training of police. He has long advocated the application of cognitive-dissonance principles to keep highly motivated rookies from taking that first step down the pyramid in a dishonest direction, by calling on their own self-concept as good guys fighting crime and violence. He proposes a program of integrity training in dealing with ethical dilemmas, in which cadets would be instilled with the values of telling the truth and doing the right thing as a central part of their emerging professional identity. (Currently, in most jurisdictions, police trainees get one evening or a couple of hours on dealing with ethical problems.) Because such values are quickly trumped on the job by competing moral codes—“You don’t rat on a fellow officer”; “In the real world, the only sure way to get a conviction is to fudge the truth”—McClurg proposes that rookies be partnered with experienced, ethical mentors who, in the manner of Alcoholics Anonymous sponsors, would help rookies maintain their honesty commitment. “The only hope of substantially reducing police lying is a preventative approach aimed at keeping good cops from turning bad,” he argues. Cognitive dissonance theory offers “a potent, inexpensive, and inexhaustible tool for accomplishing this goal: the