Intelligence: From Secrets to Policy - Mark M. Lowenthal
Just as analysts want to show the depth of their knowledge, so, too, they want to be perceived as experienced—perhaps far beyond what is true. Again, this is a common human failing. Professionals in almost any field, when surrounded by peers and facing a situation that is new to them but not to others, are tempted to assert their familiarity, whether genuine or not. Given the choice between appearing jaded (“been there, done that”) and naive (“Wow! I’ve never seen that before!”), analysts usually choose jaded. The risk of being caught seems small enough, and it is preferable to being put down by someone else who displays greater experience.
Sometimes, however, much is at stake. For example, in April 1986 the operators of the Chernobyl nuclear reactor in the Soviet Union, while running an unauthorized experiment, caused a catastrophic explosion. The next afternoon, Sweden reported higher than normal radioactive traces in its air monitors, which had been placed in many cities. In the United States an intelligence manager asked a senior analyst what he made of the Swedish complaints. The analyst played them down, saying the Swedes were always concerned about their air and often made such complaints over the smallest amounts of radiation. On learning the truth, analysts spent the following day frantically trying to catch up with the facts about Chernobyl. The jaded approach kept the analysts from making even the simplest inquiries, such as asking what types of radiation Sweden had detected. The answer would have identified the source as a reactor and not a weapon. And the prevailing winds over Sweden could have been analyzed to identify the source. (Some years later the intelligence manager met with some of his Swedish counterparts. They had initially concluded, based on analysis of the radiation and wind conditions, that a reactor at nearby Ignalina, across the Baltic Sea in Soviet territory, was leaking. Although they misidentified the source, which was a reactor much farther away, they were much closer to the truth than were U.S. intelligence officials.)
The costs of the jaded approach are threefold. First, this approach represents intellectual dishonesty, something all analysts should avoid. Second, it proceeds from the false assumption that each incident is much like others, which may be true at some superficial level but may be false at fundamental levels. Third, it closes the analyst’s thinking, regardless of his or her level of experience, to the possibility that an incident or issue is entirely new, requiring wholly new types of analysis.
Credibility is one of the most highly prized possessions of analysts. Although they recognize that no one can be correct all of the time, they are concerned that policy makers are holding them accountable to an impossible standard. Their concern about credibility—which is largely faith and trust in the integrity of the intelligence process and in the ability of the analysts whose product is at hand—can lead them to play down or perhaps mask sudden shifts in analyses or conclusions. For example, suppose intelligence analysis has long estimated a production rate of fifteen missiles a year in a hostile state. One year, because of improved collection and new methodologies, the estimated production rate (which is still just an estimate) goes to forty-five missiles per year.