When it comes to earthquakes, about the only sure thing is that the first century of seismic strides won’t hold a candle to upcoming third-millennium milestones, thanks to high-octane techno-tools. The marvels of supercomputing are already catapulting quake prediction, protection and preparedness efforts from the era of enlightenment to the era of empowerment, and it is only 2006.
Quake experts, not knowing when a forceful temblor will strike, aren’t sitting around waiting for any seismic utopia. On the contrary, they are using the centenary of San Francisco’s April 18, 1906, cataclysm to call for bigger bucks for bigger efforts to take some of the punch out of the next Big One.
|Making Waves. Using third-millennium technology, researchers “mapped” San Francisco’s 1906 quake. (Photo courtesy of Scripps Institution of Oceanography)|
“We are working very hard to deliver a common message about...what needs to be done to make things better,” says Chris D. Poland, chairman, president and CEO of Degenkolb Engineers, San Francisco, and chair of the 100th Anniversary Earthquake Conference. The April 18-22 meeting in San Francisco, to raise public awareness, was convened by the Earthquake Engineering Research Institute (EERI), the Seismological Society of America (SSA) and the California Governor’s Office of Emergency Services (OES).
To increase the power of the conference message, EERI, SSA, OES and the U.S. Geological Survey commissioned a study, called Managing Risk in Earthquake Country. The effort, led by Charles Kircher, of the Palo Alto, Calif., consulting engineering firm that bears his name, estimates losses for a repeat of the 1906 quake and sets an action plan for northern California.
Poland declines to reveal details of the report, which will be released April 17, other than to say that estimates show there would be thousands of casualties, hundreds of thousands of people left homeless and damage in excess of $100 billion in the 19-county area affected.
Conference organizers are presenting the study to local, state and federal government officials and politicians. The 10-step plan calls for developing a culture of preparedness, investing in reducing losses and ensuring resilience in recovery.
Earthquake preparedness activities go beyond northern California. Efforts are also gaining momentum in the seismically active Pacific Northwest and in Southern California. On a less-anxiety-producing level, the seismic community is also developing better ways for engineering and science to conquer nature.
Progress has been boosted, even in the last five years, by leaps in communications and computer technology, including software. “In a few years, we will be able to answer questions that a few years ago, we wouldn’t dare pose,” says Farzad Naeim, vice president and general counsel for structural consultant John A. Martin & Associates, Los Angeles.
For seismic research and testing, this means empowerment through simultaneous collaboration among simulation laboratories across the nation (see p. 26). For buildings, technology is opening the door to tailored performance-based seismic design, not only for retrofits but for new buildings. Computer power also underlies the metamorphosis of seismic bridge engineering from an art to a science.
And this is just the beginning of the era of empowerment. Many agree that even greater things are coming.
In the next 10 or so years, work will mature on probabilistic analysis incorporating fragility and cost models to advance the field of hazard mitigation, says Nabih Youssef, of the Los Angeles structural firm bearing his name. Infrastructure stakeholders will be able to incorporate these models into global seismic design strategies and more appropriately evaluate critical facilities to restore functionality after major seismic events, he says. In coming decades, Youssef thinks fault zone earthquake simulation work will lead to the next evolution of ground motion characterizations.
A major leap forward for structures, coming soon, is rational performance-based seismic design. The methodology integrates scientific information on seismic hazards with experimentally developed models of structural behavior to attain a desired performance objective—well beyond mere life safety.
“I think this is the biggest advance on the horizon,” says Jack P. Moehle, professor of structural engineering at the University of California, Berkeley, and director of the Pacific Earthquake Engineering Research Center.
|Simulation. Performance design can tailor a frame to specific performance criteria. (Photo courtesy of Nabih Youssef Associates)|
The approach allows stakeholders to make informed decisions about seismic hazard and risk from quantifiable results associated with building damage as a percentage of replacement cost, economic disruption and casualties. “It’s about the three D’s of performance design,” says Naeim. “Death, dollars and downtime.”
The Federal Emergency Management Agency and the Applied Technology Council (ATC) are sponsoring a project, called ATC-58, that will provide engineers with a methodology to calculate expected losses in an unambiguous manner and relate them to specific design decisions, says Ronald O. Hamburger, a principal of structural engineer Simpson Gumpertz & Heger, San Francisco, and the leader of ATC-58. Hamburger expects the document to be available in four years.
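The loss-calculation idea behind such methodologies can be sketched in miniature: combine fragility curves (the probability a building reaches a given damage state at a given shaking intensity) with consequence costs for each state. The sketch below is purely illustrative—the damage-state medians, dispersions, and cost ratios are hypothetical placeholders, not values from ATC-58 or any published study.

```python
from math import log, sqrt, erf

def lognormal_cdf(x, median, beta):
    """Probability that shaking x reaches a damage state with the
    given median capacity and logarithmic dispersion beta."""
    return 0.5 * (1.0 + erf(log(x / median) / (beta * sqrt(2.0))))

# Hypothetical damage states, ordered slight -> severe:
# (median spectral acceleration in g, dispersion, repair cost
#  as a fraction of replacement cost)
DAMAGE_STATES = [
    (0.2, 0.4, 0.05),   # slight damage
    (0.5, 0.4, 0.25),   # moderate damage
    (0.9, 0.4, 1.00),   # severe damage / full replacement
]

def expected_loss_ratio(sa):
    """Expected repair cost (fraction of replacement cost) at
    shaking intensity sa, in units of spectral acceleration (g)."""
    # P(reach each state or worse), per the fragility curves
    probs = [lognormal_cdf(sa, m, b) for m, b, _ in DAMAGE_STATES]
    loss = 0.0
    for i, (_, _, cost) in enumerate(DAMAGE_STATES):
        # P(ending in exactly this state) = P(reach it) - P(reach next)
        p_next = probs[i + 1] if i + 1 < len(DAMAGE_STATES) else 0.0
        loss += (probs[i] - p_next) * cost
    return loss
```

Extending this per-building calculation to casualties and downtime, and rolling it up across a regional building inventory, is what produces the kind of multi-county loss estimates the conference study reports.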
An even younger effort, by a concrete coalition formed under EERI, would try to come to the rescue of nonductile concrete frames, which are considered collapse hazards in seismic zones. One coalition goal is to improve the engineer's ability to predict which buildings will collapse. Another is to mobilize political resources to influence legislation to require retrofits of truly hazardous structures. A third is to work with financial interests to develop incentives to encourage voluntary retrofits.
The coalition has its work cut out for it. But so do all the West Coast seismic professionals.