Chaos - James Gleick
By the 1980s a vast and expensive bureaucracy devoted itself to carrying out Von Neumann’s mission, or at least the prediction part of it. America’s premier forecasters operated out of an unadorned cube of a building in suburban Maryland, near the Washington beltway, with a spy’s nest of radar and radio antennas on the roof. Their supercomputer ran a model that resembled Lorenz’s only in its fundamental spirit. Where the Royal McBee could carry out sixty multiplications each second, the speed of a Control Data Cyber 205 was measured in megaflops, millions of floating-point operations per second. Where Lorenz had been happy with twelve equations, the modern global model calculated systems of 500,000 equations. The model understood the way moisture moved heat in and out of the air when it condensed and evaporated. The digital winds were shaped by digital mountain ranges. Data poured in hourly from every nation on the globe, from airplanes, satellites, and ships. The National Meteorological Center produced the world’s second best forecasts.
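The global model itself is far beyond a sketch, but the behavior that would come to haunt such forecasting can be illustrated in miniature with the three-equation convection system Lorenz published in 1963 (not the Center's 500,000-equation model). In this toy run, two simulations start from initial conditions differing by one part in a hundred million; the gap grows until the two "forecasts" bear no resemblance to each other. The step size and run length here are illustrative choices, not anything from the operational models.

```python
def lorenz_step(x, y, z, dt, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance Lorenz's 1963 convection system one step by forward Euler."""
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dt * dx, y + dt * dy, z + dt * dz

def run(x, y, z, steps=20000, dt=0.002):
    """Integrate from (x, y, z) and return the whole trajectory."""
    path = []
    for _ in range(steps):
        x, y, z = lorenz_step(x, y, z, dt)
        path.append((x, y, z))
    return path

a = run(1.0, 1.0, 1.0)
b = run(1.0 + 1e-8, 1.0, 1.0)  # same "weather", measured 0.00000001 differently

# Separation between the two forecasts at each step
seps = [((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2) ** 0.5
        for (ax, ay, az), (bx, by, bz) in zip(a, b)]
print(f"separation at start: {seps[0]:.2e}, at end: {seps[-1]:.2e}")
```

The separation, initially microscopic, ends up comparable to the size of the attractor itself, which is the whole trouble: no conceivable precision of measurement holds the error off forever.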
The best came out of Reading, England, a small college town an hour’s drive from London. The European Centre for Medium Range Weather Forecasts occupied a modest tree-shaded building in a generic United Nations style, modern brick-and-glass architecture, decorated with gifts from many lands. It was built in the heyday of the all-European Common Market spirit, when most of the nations of western Europe decided to pool their talent and resources in the cause of weather prediction. The Europeans attributed their success to their young, rotating staff—no civil service—and their Cray supercomputer, which always seemed to be one model ahead of the American counterpart.
Weather forecasting was the beginning but hardly the end of the business of using computers to model complex systems. The same techniques served many kinds of physical scientists and social scientists hoping to make predictions about everything from the small-scale fluid flows that concerned propeller designers to the vast financial flows that concerned economists. Indeed, by the seventies and eighties, economic forecasting by computer bore a real resemblance to global weather forecasting. The models would churn through complicated, somewhat arbitrary webs of equations, meant to turn measurements of initial conditions—atmospheric pressure or money supply—into a simulation of future trends. The programmers hoped the results were not too grossly distorted by the many unavoidable simplifying assumptions. If a model did anything too obviously bizarre—flooded the Sahara or tripled interest rates—the programmers would revise the equations to bring the output back in line with expectation. In practice, econometric models proved dismally blind to what the future would bring, but many people who should have known better acted as though they believed in the results. Forecasts of economic growth or unemployment were put forward with an implied precision of two or three decimal places. Governments and financial institutions paid for such predictions and acted on them, perhaps out of necessity or for want of anything better. Presumably they knew that such variables as “consumer optimism” were not as nicely measurable as “humidity” and that the perfect differential equations had not yet been written for the movement of