Fig. 10.4 1-D example of conditioning by kriging
In earlier implementations of TB, hardware limitations meant that the positions of the original N lines were evident in the resulting simulated image. Such artifacts, particularly when the method is poorly implemented, can be a significant disadvantage. The solution is to use a very large number N of lines, which is more practical now than in years past.
10.2.3 LU Decomposition

When the total number of conditioning data plus the number of nodes to be simulated is small (fewer than a few hundred) and a large number of realizations is requested, simulation through LU decomposition of the covariance matrix provides the fastest solution (Luster 1985; Alabert 1987b).

Let Y(u) be the stationary Gaussian RF model with covariance C_Y(u). Let u_α, α = 1, …, n, be the locations of the conditioning data and u'_i, i = 1, …, N, be the N nodes to be simulated. The large covariance matrix, of dimension (n + N) · (n + N), is partitioned into the data-to-data covariance matrix, the node-to-node covariance matrix, and the two node-to-data covariance matrices:

$$
\mathbf{C}_{(n+N)\times(n+N)} =
\begin{bmatrix}
\left[ C_Y(\mathbf{u}_\alpha - \mathbf{u}_\beta) \right]_{n\times n} & \left[ C_Y(\mathbf{u}_\alpha - \mathbf{u}'_j) \right]_{n\times N} \\[4pt]
\left[ C_Y(\mathbf{u}'_i - \mathbf{u}_\beta) \right]_{N\times n} & \left[ C_Y(\mathbf{u}'_i - \mathbf{u}'_j) \right]_{N\times N}
\end{bmatrix}
= \mathbf{L}\cdot\mathbf{U}
$$
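To make the block structure above concrete, here is a minimal Python sketch (not from the text) that assembles the partitioned (n + N) × (n + N) covariance matrix; the isotropic exponential covariance, the 1-D coordinates, the sill, and the range are all hypothetical illustration choices.

import numpy as np
from scipy.spatial.distance import cdist

def cov_exponential(h, sill=1.0, a=100.0):
    # Assumed isotropic exponential covariance C_Y(h) = sill * exp(-3h/a)
    return sill * np.exp(-3.0 * h / a)

# Hypothetical 1-D locations: n conditioning data u_alpha and N nodes u'_i
u_data  = np.array([[12.0], [37.0], [81.0]])            # n = 3
u_nodes = np.linspace(0.0, 100.0, 21).reshape(-1, 1)    # N = 21

C_nn = cov_exponential(cdist(u_data,  u_data))    # data-to-data block
C_nN = cov_exponential(cdist(u_data,  u_nodes))   # data-to-node block
C_Nn = C_nN.T                                     # node-to-data block (symmetry)
C_NN = cov_exponential(cdist(u_nodes, u_nodes))   # node-to-node block

C = np.block([[C_nn, C_nN],
              [C_Nn, C_NN]])                      # full (n + N) x (n + N) matrix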
The large matrix C is decomposed into the product of a lower and an upper triangular matrix, C = L · U. A conditional realization {y^(l)(u'_i), i = 1, …, N} is obtained by multiplication of L by an (N + n) × 1 column matrix ω^(l) of normal deviates:
$$
\begin{bmatrix}
\left[ y(\mathbf{u}_\alpha) \right]_{n\times 1} \\[4pt]
\left[ y^{(l)}(\mathbf{u}'_i) \right]_{N\times 1}
\end{bmatrix}
= \mathbf{L}\cdot\boldsymbol{\omega}^{(l)} =
\begin{bmatrix}
\mathbf{L}_{11} & \mathbf{0} \\
\mathbf{L}_{21} & \mathbf{L}_{22}
\end{bmatrix}
\begin{bmatrix}
\boldsymbol{\omega}_{1} \\
\boldsymbol{\omega}_{2}^{(l)}
\end{bmatrix}
$$

The first n deviates are not drawn at random: ω_1 is obtained by solving L_11 · ω_1 = [y(u_α)], so that the conditioning data are reproduced exactly, while ω_2^(l) is a vector of N independent standard normal deviates drawn anew for each realization l.
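The following is a minimal sketch of the complete conditional simulation, using the same hypothetical exponential covariance and locations as above together with made-up data values; the Cholesky factor of the symmetric positive-definite matrix C plays the role of L (with U = Lᵀ), and the block multiplication follows the equation above.

import numpy as np
from scipy.spatial.distance import cdist
from scipy.linalg import solve_triangular

rng = np.random.default_rng(0)

def cov_exponential(h, sill=1.0, a=100.0):
    # Same assumed exponential covariance as in the previous sketch
    return sill * np.exp(-3.0 * h / a)

# Hypothetical conditioning data (normal-score values) and nodes to simulate
u_data  = np.array([[12.0], [37.0], [81.0]])            # n data locations u_alpha
y_data  = np.array([0.8, -0.4, 1.2])                    # data values y(u_alpha)
u_nodes = np.linspace(0.0, 100.0, 21).reshape(-1, 1)    # N node locations u'_i
n, N = len(u_data), len(u_nodes)

# Assemble the full covariance matrix and factor it once: C = L L^T
locs = np.vstack([u_data, u_nodes])
C = cov_exponential(cdist(locs, locs))
L = np.linalg.cholesky(C)
L11, L21, L22 = L[:n, :n], L[n:, :n], L[n:, n:]   # blocks of the equation above

# omega_1 is fixed by the data: solve L11 * omega_1 = y(u_alpha)
w1 = solve_triangular(L11, y_data, lower=True)

# Each realization reuses L and only draws N new standard normal deviates
n_real = 100
realizations = np.empty((n_real, N))
for l in range(n_real):
    w2 = rng.standard_normal(N)
    realizations[l] = L21 @ w1 + L22 @ w2          # y^(l)(u'_i), i = 1, ..., N

Because L is computed only once and each additional realization costs a single matrix-vector product, the approach is attractive precisely in the setting described above: a small total number of locations and a large number of requested realizations.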