On December 13, 2002, President George W. Bush announced his administration’s plan to protect the nation from a smallpox attack. The plan, which many in the scientific community had opposed, involved compulsory vaccination of a half-million U.S. military personnel, followed by a voluntary campaign to vaccinate a roughly equal number of frontline hospital workers and public health department staff, the health workers most likely to come into contact with the virus during an outbreak. After that, the plan called for the voluntary vaccination of some 10 million firefighters, police, and other “first responders.” The military vaccination campaign went smoothly enough. But the civilian campaign quickly collapsed. Only 38,000 health workers agreed to be vaccinated, and many American hospitals refused to participate at all.15
The complex concerns elicited by the civilian program would have been familiar to the many Americans who refused vaccination at the turn of the twentieth century. Many of the health workers believed they had a specific medical condition that made smallpox vaccination particularly hazardous for them. (In fact, experts believe that as many as one in five Americans today may have contraindications to smallpox vaccination, including immune systems weakened by HIV.) Others worried about the common side effects of the smallpox vaccine, still known as “the most dangerous vaccine.” Many felt the risk of a bioterrorist attack was too low to make getting vaccinated a good bet. (The invasion of Iraq had revealed that Saddam Hussein held no secret stockpile of variola.) Another key factor was the lack, in the first stages of the vaccination campaign, of a federal program to compensate people for death, injury, or lost work caused by vaccination. In the end, the failed civilian program reported nearly nine hundred adverse reactions to the vaccine, including one death. The military program reported seventy-five cases of heart inflammation and one death.16
It was a revealing episode. In the absence of a palpable threat of an outbreak, few twenty-first-century Americans would step forward and get vaccinated against smallpox. Clearly, ignorance had little to do with it. Presumably, the more than 400,000 health workers who declined to roll up their sleeves were exceptionally well informed about the risks. Even the relatively small risks of the vaccine were deemed unacceptable as long as the threat of a smallpox attack seemed remote.
Even as smallpox disappeared from America and the world in the final decades of the twentieth century, vaccines proliferated. Thanks in large part to the polio success story, so did vaccine laws. By the century’s end, all fifty states mandated that children receive immunization shots to protect them against seven different diseases. The number continues to grow. State-mandated vaccination is far more extensive than it was a century ago. But most states now provide precisely the sort of exemptions that the turn-of-the-century antivaccinationists in Europe and the United States had demanded. People may now ask to be exempted for medical or religious reasons, or even, in some states, for conscientious objection to vaccination.17
For all this, public distrust of vaccines is on the rise, caused in part by the unprecedented complexity of the childhood immunization landscape and fueled by the explosive communicative power of the Internet. No longer do rumors of sore arms and lost limbs circulate by word of mouth across communities of workers; a bottomless archive of information and misinformation about vaccines is just a few keystrokes away.