Pale Blue Dot - Carl Sagan
But now imagine a totalitarian state not overrun by enemy troops, but one thriving and self-confident. Imagine a tradition in which orders are obeyed without question. Imagine that those involved in the operation are supplied a cover story: The asteroid is about to impact the Earth, and it is their job to deflect it—but in order not to worry people needlessly, the operation must be performed in secret. In a military setting with a command hierarchy firmly in place, compartmentalization of knowledge, general secrecy, and a cover story, can we be confident that even apocalyptic orders would be disobeyed? Are we really sure that in the next decades and centuries and millennia, nothing like this might happen? How sure are we?
It’s no use saying that all technologies can be used for good or for ill. That is certainly true, but when the “ill” achieves a sufficiently apocalyptic scale, we may have to set limits on which technologies may be developed. (In a way we do this all the time, because we can’t afford to develop all technologies. Some are favored and some are not.) Or constraints may have to be levied by the community of nations on madmen and autarchs and fanaticism.
Tracking asteroids and comets is prudent, it’s good science, and it doesn’t cost much. But, knowing our weaknesses, why would we even consider now developing the technology to deflect small worlds? For safety, shall we imagine this technology in the hands of many nations, each providing checks and balances against misuse by another? This is nothing like the old nuclear balance of terror. It hardly inhibits some madman intent on global catastrophe to know that if he does not hurry, a rival may beat him to it. How confident can we be that the community of nations will be able to detect a cleverly designed, clandestine asteroid deflection in time to do something about it? If such a technology were developed, can any international safeguards be envisioned that have a reliability commensurate with the risk?
Even if we restrict ourselves merely to surveillance, there’s a risk. Imagine that in a generation we characterize the orbits of 30,000 objects of 100-meter diameter or more, and that this information is publicized, as of course it should be. Maps will be published showing near-Earth space black with the orbits of asteroids and comets, 30,000 swords of Damocles hanging over our heads—ten times more than the number of stars visible to the naked eye under conditions of optimum atmospheric clarity. Public anxiety might be much greater in such a time of knowledge than in our current age of ignorance. There might be irresistible public pressure to develop means to mitigate even nonexistent threats, which would then feed the danger that deflection technology would be misused. For this reason, asteroid discovery and surveillance may not be a mere neutral tool of future policy, but rather a kind of booby trap. To me, the only foreseeable solution is a combination of accurate orbit estimation, realistic threat assessment, and effective public education—so that in democracies at least, the citizens can make their own, informed decisions. This is a job for NASA.
Near-Earth asteroids, and means of altering their orbits, are being looked at seriously. There is some sign that officials in the Department of Defense and the weapons laboratories