understanding of the phenomenological aspects by proposing design values related
to them, despite the complexity of a phenomenon governed by Chaos Theory and an
enormous number of unknowns.
The criticism of the deterministic approach is that the proposed design values of
seismic actions are maximum ones, which might never be reached during the
structure's life. Conversely, the probabilistic approach is criticized for the small
number of records available at the same site from the same source.
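The contrast between the two criticisms can be made concrete with a toy calculation. The sketch below uses entirely hypothetical magnitudes: the deterministic design value is taken as a geologically inferred maximum that may never occur in the structure's life, while the probabilistic value is read off a small, possibly unrepresentative sample of site records.

```python
# Toy contrast between deterministic and probabilistic design values.
# All numbers below are hypothetical, for illustration only.

records = [5.1, 5.8, 6.2, 5.5, 6.0, 5.3, 6.4]  # assumed magnitudes recorded at one site

# Deterministic approach: design for the maximum magnitude the source is
# judged capable of (here an assumed geological estimate), even though such
# an event may never occur during the structure's life.
m_max_credible = 7.5
deterministic_design = m_max_credible

# Probabilistic approach: estimate, from the small sample of records alone,
# the magnitude not exceeded by 90% of the observed events.
ranked = sorted(records)
idx = int(0.9 * (len(ranked) - 1))
probabilistic_design = ranked[idx]

print(deterministic_design)   # 7.5
print(probabilistic_design)   # 6.2
```

The gap between the two values illustrates both criticisms at once: the deterministic figure may be conservatively large, while the probabilistic figure rests on only seven records.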
In this dispute, some examples are very instructive: for instance, the
subduction fault running from Northern California through Oregon and Washington
into British Columbia, and the New Madrid area in Eastern North America. In the
first case, in the decades since seismic activity began to be monitored, no
earthquake with magnitude greater than 6.0 has been recorded. Yet geological
studies have shown that giant earthquakes have struck the region in the past
7000 years; the last one occurred in January 1700 with a magnitude M = 9.0,
very close to the maximum possible value. In the second case, during 1811-1812
three very strong earthquakes of magnitude 7.6 to 7.9 shook the Missouri area of
the US; since these events, the zone has remained almost quiet. Under the
probabilistic approach, these situations could be ignored, but interpreting the
geological evidence through the deterministic approach, the possibility of a very
important earthquake has to be considered. By contrast, since many recorded
ground motions exist for the earthquakes produced along the San Andreas
(US) fault and in the Vrancea (Romania) zone, Statistical Seismology can be
applied very usefully in these cases. A different situation corresponds to the
Balkan earthquakes, where the epicenter locations are very diffuse, two
events never having been recorded in the same place.
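Whether statistics help in each of these cases comes down to the sample size available per source type. The following sketch (all record counts and characteristic labels are hypothetical) shows how pooling records across sources that share the same tectonic characteristics enlarges the usable sample, while a diffuse zone with few records per site stays data-poor:

```python
from collections import defaultdict

# Hypothetical catalogue: per-zone record counts and fault characteristics.
# The characteristic triples and counts are assumptions for illustration.
zones = {
    "San Andreas":      {"n_records": 300, "type": ("shallow crustal", "interplate", "strike-slip")},
    "North Anatolian":  {"n_records": 40,  "type": ("shallow crustal", "interplate", "strike-slip")},
    "Balkan (diffuse)": {"n_records": 3,   "type": ("shallow crustal", "intraplate", "dip-slip")},
}

# Pool record counts across zones that share the same characteristic triple:
# this is the sense in which combining data from similar sources increases
# the number of valuable data for statistical analysis.
pooled = defaultdict(int)
for name, zone in zones.items():
    pooled[zone["type"]] += zone["n_records"]

print(pooled[("shallow crustal", "interplate", "strike-slip")])  # 340
print(pooled[("shallow crustal", "intraplate", "dip-slip")])     # 3
```

Pooling only across matching characteristics is the crucial constraint: records from a strike-slip interplate fault enlarge the sample for similar faults, but contribute nothing valid to a dip-slip intraplate zone.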
These very different situations show that, where the positions of the sources
are well identified, Statistical Seismology can be very useful for defining the
design earthquakes. But in the case of diffuse earthquakes, a different approach
must be used to predict them. This is the task of the newly developed Theory of
Multi-Source Data Fusion (Semerdjiev, 1999; Leebmann and Kiema, 2000; Yager,
2004). This new field of statistical analysis is devoted to obtaining
information of better quality by fusing data originating from different
sources. It is a general theory that can be used in very different fields of
engineering. In the seismic field, it is based on the evaluation of tectonic,
geological, seismological and geotechnical data for different sources. If these
characteristics are similar for several sources, the number of valuable data for
statistical analysis increases. For instance, the San Andreas, North Anatolian and
New Zealand faults share the same characteristics: shallow crustal, interplate,
strike-slip type. Therefore data from the better-studied Californian
earthquakes can also be used for the North Anatolian earthquakes, but not for other
seismic zones. A very frequent mistake is to use the well-known El Centro record,
whose characteristics correspond to strike-slip earthquakes, for designing
buildings located in Italy, Romania or Greece, where the earthquake
characteristics are very different. Some seismic areas of North America, Europe,
Asia and Australia present a shallow intraplate, normal dip-slip type. The
differences in the earthquake