Cascadia's Fault - Jerry Thompson [91]
What Eddie Bernard and a team of more than twenty-five PMEL engineers, technicians, and scientists, along with eighty-five partner companies and suppliers, came up with was a four-stage warning system they called a “tsunameter,” which does for wave detection what seismographs do for earthquake measurement.
They started with a device that records pressure changes at the bottom of the ocean. Waves whipped up by storms or hurricanes affect only the surface layer of the sea. A subduction earthquake lifts the entire water column from bottom to top. When a mound of seawater several miles deep is lifted and breaks into a series of waves that start to roll across the ocean, the weight of all that water can be measured as a change in pressure when the wave passes over the bottom pressure recorder (BPR) developed by the team at PMEL. The BPR had to be able to function under almost twenty thousand feet (6,000 m) of water without needing maintenance for at least two years.
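The physics behind this detection scheme can be sketched with a little arithmetic. The book gives no formulas, so the following is an illustrative sketch only, not PMEL's actual algorithm; the density and wavelength values are assumptions. A tsunami lifts the whole water column, so the bottom pressure shifts hydrostatically by ρgΔh no matter how deep the water is, while the pressure signal of a short-period storm wave dies off exponentially with depth and never reaches the BPR.

```python
import math

# Illustrative sketch (not PMEL's actual processing): why a bottom
# pressure recorder sees a tsunami but is blind to storm waves.
RHO = 1025.0  # seawater density, kg/m^3 (assumed typical value)
G = 9.81      # gravitational acceleration, m/s^2

def bottom_pressure_change(surface_rise_m):
    """Hydrostatic pressure change (Pa) at the sea floor when the
    entire water column is lifted, as in a tsunami."""
    return RHO * G * surface_rise_m

def storm_wave_attenuation(wavelength_m, depth_m):
    """Fraction of a deep-water wave's pressure signal that survives
    at a given depth: it decays like exp(-k * depth)."""
    k = 2 * math.pi / wavelength_m
    return math.exp(-k * depth_m)

# A tsunami only 2 cm high in the open ocean still shifts bottom
# pressure by about 200 Pa -- well within the BPR's reach.
print(round(bottom_pressure_change(0.02), 1))  # 201.1

# A 100 m storm wave leaves essentially nothing at 6,000 m depth.
print(storm_wave_attenuation(100, 6000) < 1e-100)  # True
```

This is why the text can say storm waves "affect only the surface layer": at the depths where the BPR sits, only a whole-column displacement registers at all.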
The second stage of the system involved an acoustical transmission device that could beam the pressure data up to a buoy tethered by cable at the surface. This turned out to be the greatest engineering challenge of all, although they eventually found a way to do it. Deep-ocean buoy technology had already been developed for NOAA's Tropical Atmosphere Ocean weather forecasting system, but the gear needed modifications to ensure it could survive the more frequent and severe storms of the North Pacific.
In the third stage, the buoys would relay the pressure data from the BPR to an orbiting satellite that would beam the signal back to land. In the fourth and final stage, the data would be received and processed at the two Pacific tsunami warning centers.
That was the plan. Making it happen was something else. They had to build and deploy a new generation of buoys that could withstand an entire year on the wild and turbulent surface of the North Pacific. The equipment for each tsunameter—the BPR, acoustical transmitter, buoy, and satellite relay—cost roughly $250,000, plus another $30,000 per year for maintenance. The most expensive part of the process, however, would be delivering and anchoring the buoy systems in the deep ocean, using ships that cost roughly $22,000 per day to operate.
A prototype to be deployed two hundred miles (320 km) off the coast of Oregon was ready to go by September 1997. It quickly delivered an accurate stream of data, so NOAA decided to install two more. It would take eight different ships on eighteen cruises—more than ninety days of sea time—to set up this initial three-station array. The good news was that it worked better than expected. It was transmitting tsunami data with a reliability factor of 97 percent—much higher than the 80 percent success rate they had hoped for from a prototype.
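The figures quoted above make the author's point about ship time concrete. The following back-of-envelope tally is my own aggregation of the numbers in the text (NOAA's real accounting would differ); it shows that delivering and anchoring the buoys dwarfed the hardware cost of the initial three-station array.

```python
# Back-of-envelope cost of the initial three-station array, using
# only the figures quoted in the text (hypothetical aggregation).
HARDWARE_PER_STATION = 250_000   # BPR, transmitter, buoy, satellite relay
MAINTENANCE_PER_YEAR = 30_000    # per station, per year
SHIP_DAY_RATE = 22_000           # cost of ship time per day

stations = 3
sea_days = 90                    # "more than ninety days of sea time"

hardware = stations * HARDWARE_PER_STATION           # 750,000
ship_time = sea_days * SHIP_DAY_RATE                 # 1,980,000
first_year_upkeep = stations * MAINTENANCE_PER_YEAR  # 90,000

total = hardware + ship_time + first_year_upkeep
print(f"${total:,}")                                 # $2,820,000
print(f"ship time share: {ship_time / total:.0%}")   # ~70%
```

On these numbers, roughly seventy cents of every dollar went to ship operations, which is exactly why the text flags deployment as "the most expensive part of the process."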
That was just the beginning. Since the Ring of Fire’s subduction zones constantly eat slabs of sea floor, there was an enormous amount of real estate to cover: more than 5,600 miles (9,000 km) of plate boundaries and grinding trenches that could create large earthquakes and trigger tsunamis. NOAA figured they would need buoys spaced 125 to 250 miles (200–400 km) apart to “reliably assess the main energy beam” of a tsunami generated by a magnitude 8 event. Full coverage would require deployment of twenty-five to fifty tsunameter stations.
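NOAA's twenty-five-to-fifty-station estimate follows directly from the spacing figures above. A minimal sketch of that arithmetic, rounding up since a fraction of a station is still a station:

```python
import math

# NOAA's station-count estimate, recovered from the text's figures:
# ~9,000 km of plate boundary, one buoy every 200-400 km.
BOUNDARY_KM = 9000

def stations_needed(spacing_km):
    """Stations required to cover the boundary at a given spacing."""
    return math.ceil(BOUNDARY_KM / spacing_km)

print(stations_needed(400))  # 23 -> roughly the low-end estimate of 25
print(stations_needed(200))  # 45 -> roughly the high-end estimate of 50
```

The round numbers in the text (twenty-five to fifty) are evidently the sensible planning figures on either side of this range.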
NOAA, the USGS, FEMA, and the five Pacific states that funded the project realized early on that if and when the network was finally built out to its full extent, it would still be too small. With buoys this far apart, some smaller but nonetheless destructive tsunamis could slip through the gaps undetected. This floating line