mb and Ms are theoretically expected and experimentally observed to saturate (around 6.3 and 8.2,
respectively) with increasing seismic moment and are essentially useless for estimating the size
of the mega-earthquakes capable of producing tsunamis that cause damage in the far field.
(3) If all earthquakes obeyed scaling laws, the measurement of one or another magnitude
should in principle be equivalent, and an analyst should be able to predict the low-frequency
value of the seismic moment by measuring the source in a different frequency band. However,
earthquakes with similar moments produce widely scattered estimates of magnitude, and
“tsunami earthquakes” feature anomalous source characteristics. The observational challenge is
to identify such anomalous events in real time.
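As a rough numerical illustration of that point, the sketch below converts a seismic moment to a moment magnitude using the standard Hanks-Kanamori relation (Mw = (2/3) log10 M0 - 6.07, with M0 in N·m) and compares it with a band-limited magnitude such as Ms; the specific numbers are illustrative choices, not measurements of any particular event.

```python
import math

def moment_to_mw(m0_newton_m: float) -> float:
    """Hanks-Kanamori moment magnitude from seismic moment in N*m."""
    return (2.0 / 3.0) * (math.log10(m0_newton_m) - 9.1)

# Illustrative moment for a great subduction earthquake (not a measured value):
m0 = 4.0e22                      # N*m  ->  Mw ~ 9.0
mw = moment_to_mw(m0)

# If all earthquakes obeyed the same scaling laws, a shorter-period magnitude
# such as Ms would track Mw up to its saturation level (~8.2).  A saturated or
# anomalous ("tsunami earthquake") measurement sits well below the moment value.
ms_band_limited = 8.1            # hypothetical band-limited estimate
print(f"Mw from moment   : {mw:.1f}")
print(f"Band-limited Ms  : {ms_band_limited:.1f}")
print(f"Apparent deficit : {mw - ms_band_limited:.1f} magnitude units")
```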
(4) While the goal of tsunami warning is to quantify the earthquake (e.g., hypocenter, magnitude,
focal mechanism, and fault extent) as quickly as possible upon detection, it is also
imperative to record the source in its entirety in order to assess its full tsunamigenic potential.
Bearing in mind, for example, that the source of the 2004 Sumatra earthquake lasted eight
minutes, we realize that assessing its size within five minutes is at best a challenge and at worst
an impossible task. Unfortunately, there is really no consensus among seismologists as to the
deterministic nature of earthquake rupture, namely whether the early stages of nucleation
of a large earthquake carry a fingerprint of the eventual true size of the event. Indeed, several
examples of delayed sources (e.g., 2001 Peru and 2006 Kuril Islands, both having generated
destructive tsunamis) reveal a sudden increase in seismic moment release as late as one or two
minutes into their source process; they constitute another class of events violating scaling laws.
In lay terms, at the initiation of a seismic rupture, does Mother Nature really know how large
the final product will be? Yet it is that final product that will control the tsunami and that the
watchstanders at the Tsunami Warning Centers (TWCs) are charged with estimating, as swiftly
and as reliably as possible.
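To make the timing problem concrete, the short sketch below integrates two purely hypothetical moment-rate histories of equal final moment, one releasing moment steadily and one with a delayed surge, and reports what fraction of the final moment has accumulated after the first minute; the functions are illustrative inventions, not models of the events named above.

```python
import numpy as np

dt = 1.0                                  # seconds
t = np.arange(0.0, 480.0, dt)             # 8-minute window

# Two hypothetical moment-rate histories (arbitrary units), same final moment:
steady = np.where(t < 240.0, 1.0, 0.0)                     # uniform release over 4 min
delayed = np.where((t >= 100.0) & (t < 180.0), 3.0, 0.0)   # surge starting ~100 s in

for name, rate in (("steady", steady), ("delayed", delayed)):
    cumulative = np.cumsum(rate) * dt
    frac_60s = cumulative[int(60.0 / dt)] / cumulative[-1]
    print(f"{name:8s}: {100.0 * frac_60s:.0f}% of final moment released in first 60 s")
```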
(5) Seismic data and sophisticated processing are insufficient to determine the destructiveness
of tsunamis. Gusiakov uses the Soloviev-Imamura tsunami intensity scale, which is based on
run-up data, to show that there is only a tendency toward larger tsunamis with increasing
earthquake magnitude. The lack of a direct correlation can be attributed in part to secondary
mechanisms (submarine slumps and slides) in the generation of tsunamis, as shown in the
findings by Plafker, where submarine landslides account for many large and destructive tsunamis.
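For reference, the Soloviev-Imamura intensity invoked above is commonly defined from the mean run-up height H (in meters) on the nearest coast as I = 1/2 + log2 H; the short sketch below simply evaluates that definition for a few hypothetical run-up values (the heights are illustrative, not observations tied to any event).

```python
import math

def soloviev_imamura_intensity(mean_runup_m: float) -> float:
    """Soloviev-Imamura tsunami intensity: I = 0.5 + log2(H_mean)."""
    return 0.5 + math.log2(mean_runup_m)

# Hypothetical mean run-up heights in meters (illustrative only)
for h in (0.5, 2.0, 8.0, 30.0):
    print(f"mean run-up {h:5.1f} m  ->  intensity I = "
          f"{soloviev_imamura_intensity(h):.1f}")
```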
THE Mwp ALGORITHM
The application of geometrical optics to seismology reveals that the earth's ground motion
resulting from the passage of P-waves in the far-field is related to the time derivative of the his-
tory of the deformation or physical slip at the source. In other words, if a permanent deforma-
tion (in the form of a step in displacement) is incurred at the epicenter, the far-field signal will
register an impulse (or spike) of short duration, followed by a return to quiescence. Conversely,
the deformation at the source should be obtainable by mathematically integrating the ground
displacement over time in the far field, and by performing a number of theoretically justifiable
corrections, which account, for example, for the path from epicenter to receiver. As most seis-
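As a minimal sketch of how such an integration-based estimate can be assembled, the code below integrates a vertical P-wave displacement trace, takes the peak of the integral, scales it by a far-field geometrical-spreading and radiation-pattern factor to obtain a moment, and converts the moment to a magnitude; the density, P-wave speed, radiation coefficient, distance, and synthetic pulse are all assumptions made for illustration, not the specific corrections prescribed by the method.

```python
import numpy as np

def mwp_estimate(uz, dt, distance_m,
                 rho=3400.0,      # assumed mantle density, kg/m^3
                 alpha=7900.0,    # assumed P-wave speed, m/s
                 f_p=0.52):       # assumed average P radiation coefficient
    """Rough Mwp-style estimate from a vertical P-wave displacement trace.

    uz         : vertical displacement samples (m), P-wave window only
    dt         : sample interval (s)
    distance_m : epicenter-to-receiver distance in meters
    """
    # Integrate displacement over time; its peak scales with seismic moment
    # for a far-field P wave from a step-like (permanent) source deformation.
    integral = np.cumsum(uz) * dt
    peak = np.max(np.abs(integral))

    # Far-field relation: uz ~ f_p * Mdot0(t - r/alpha) / (4*pi*rho*alpha^3*r)
    m0 = 4.0 * np.pi * rho * alpha**3 * distance_m * peak / f_p

    # Hanks-Kanamori conversion from moment (N*m) to moment magnitude.
    return (np.log10(m0) - 9.1) / 1.5

# Synthetic, purely illustrative "P-wave" displacement pulse (not real data):
dt = 0.05                                     # 20 samples per second
t = np.arange(0.0, 60.0, dt)
uz = 2.0e-4 * np.exp(-((t - 20.0) / 5.0)**2)  # smooth pulse, ~0.2 mm peak

print(f"Mwp-style magnitude: {mwp_estimate(uz, dt, 6.0e6):.1f}")
```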