Chaos - James Gleick [6]
To most serious meteorologists, forecasting was less than science. It was a seat-of-the-pants business performed by technicians who needed some intuitive ability to read the next day’s weather in the instruments and the clouds. It was guesswork. At centers like M.I.T., meteorology favored problems that had solutions. Lorenz understood the messiness of weather prediction as well as anyone, having tried it firsthand for the benefit of military pilots, but he harbored an interest in the problem—a mathematical interest.
Not only did meteorologists scorn forecasting, but in the 1960s virtually all serious scientists mistrusted computers. These souped-up calculators hardly seemed like tools for theoretical science. So numerical weather modeling was something of a bastard problem. Yet the time was right for it. Weather forecasting had been waiting two centuries for a machine that could repeat thousands of calculations over and over again by brute force. Only a computer could cash in the Newtonian promise that the world unfolded along a deterministic path, rule-bound like the planets, predictable like eclipses and tides. In theory a computer could let meteorologists do what astronomers had been able to do with pencil and slide rule: reckon the future of their universe from its initial conditions and the physical laws that guide its evolution. The equations describing the motion of air and water were as well known as those describing the motion of planets. Astronomers did not achieve perfection and never would, not in a solar system tugged by the gravities of nine planets, scores of moons and thousands of asteroids, but calculations of planetary motion were so accurate that people forgot they were forecasts. When an astronomer said, “Comet Halley will be back this way in seventy-six years,” it seemed like fact, not prophecy. Deterministic numerical forecasting figured accurate courses for spacecraft and missiles. Why not winds and clouds?
Weather was vastly more complicated, but it was governed by the same laws. Perhaps a powerful enough computer could be the supreme intelligence imagined by Laplace, the eighteenth-century philosopher-mathematician who caught the Newtonian fever like no one else: “Such an intelligence,” Laplace wrote, “would embrace in the same formula the movements of the greatest bodies of the universe and those of the lightest atom; for it, nothing would be uncertain and the future, as the past, would be present to its eyes.” In these days of Einstein’s relativity and Heisenberg’s uncertainty, Laplace seems almost buffoon-like in his optimism, but much of modern science has pursued his dream. Implicitly, the mission of many twentieth-century scientists—biologists, neurologists, economists—has been to break their universes down into the simplest atoms that will obey scientific rules. In all these sciences, a kind of Newtonian determinism has been brought to bear. The fathers of modern computing always had Laplace in mind, and the history of computing and the history of forecasting have been intermingled ever since John von Neumann designed his first machines at the Institute for Advanced Study in Princeton, New Jersey, in the 1950s. Von Neumann recognized that weather modeling could be an ideal task for a computer.
There was always one small compromise, so small that working scientists usually forgot it was there, lurking in a corner of their philosophies.