FIGURE 1.10  KL decomposition of linear vectors.
The covariance matrix is computed using $\hat{S} = \frac{1}{2}\sum_{i=1}^{2} Z_i Z_i^T$, which becomes:

$$\hat{S} = \begin{bmatrix} \sigma_{11} \\ \sigma_{21} \end{bmatrix} \begin{bmatrix} \sigma_{11} & \sigma_{21} \end{bmatrix} = \begin{bmatrix} \sigma_{11}^2 & \sigma_{11}\sigma_{21} \\ \sigma_{21}\sigma_{11} & \sigma_{21}^2 \end{bmatrix}. \qquad (1.27)$$
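As a quick numerical check, the following minimal NumPy sketch (the sample values `s11`, `s21` and the two-sample setup are illustrative assumptions, not taken from the text) confirms that collinear samples produce the rank-one outer-product covariance of Eq. (1.27):

```python
import numpy as np

# Illustrative component values for the linear vector (assumed, not from the text).
s11, s21 = 3.0, 4.0

# Two samples lying on the same straight line through the origin.
Z = np.array([[ s11,  s21],
              [-s11, -s21]])

# Covariance matrix: S_hat = (1/2) * sum_i Z_i Z_i^T
S_hat = sum(np.outer(z, z) for z in Z) / 2

print(S_hat)
# [[ 9. 12.]
#  [12. 16.]]  i.e. [[s11**2, s11*s21], [s21*s11, s21**2]]
```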
The eigenvalues are computed using $\det(\hat{S} - \lambda I) = 0$, which becomes:

$$\det \begin{bmatrix} \sigma_{11}^2 - \lambda & \sigma_{11}\sigma_{21} \\ \sigma_{21}\sigma_{11} & \sigma_{21}^2 - \lambda \end{bmatrix} = (\sigma_{11}^2 - \lambda)(\sigma_{21}^2 - \lambda) - \sigma_{11}^2 \sigma_{21}^2 = \lambda^2 - (\sigma_{11}^2 + \sigma_{21}^2)\lambda = 0. \qquad (1.28)$$
The solution of this characteristic polynomial in $\lambda$ is $\lambda = 0$ or $\lambda = \sigma_{11}^2 + \sigma_{21}^2$, resulting in only one dominant eigenvalue.
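Continuing the same illustrative sketch (same assumed values), the eigenvalues can be verified numerically; as in Eq. (1.28), one of them vanishes:

```python
import numpy as np

s11, s21 = 3.0, 4.0  # same illustrative values as above
S_hat = np.array([[s11**2,  s11*s21],
                  [s21*s11, s21**2 ]])

# S_hat is symmetric, so eigvalsh applies; eigenvalues return in ascending order.
eigvals = np.linalg.eigvalsh(S_hat)
print(eigvals)  # approximately [ 0. 25.] -> lambda = 0 and lambda = s11**2 + s21**2
```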
The corresponding eigenvector is computed using $(\hat{S} - \lambda_i I)\phi_i = 0$, which becomes:
$$\begin{bmatrix} \sigma_{11}^2 - (\sigma_{11}^2 + \sigma_{21}^2) & \sigma_{11}\sigma_{21} \\ \sigma_{21}\sigma_{11} & \sigma_{21}^2 - (\sigma_{11}^2 + \sigma_{21}^2) \end{bmatrix} \begin{bmatrix} \phi_1 \\ \phi_2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix} \;\Rightarrow\; -\sigma_{21}^2 \phi_1 + \sigma_{11}\sigma_{21} \phi_2 = 0 \;\Rightarrow\; \phi_1 = \frac{\sigma_{11}}{\sigma_{21}} \phi_2, \qquad (1.29)$$
which is the equation for a straight line. Since the eigenvector has to be orthonormal, the orthonormality condition $\phi^T \phi = I$ implies $\phi_1^2 + \phi_2^2 = 1$, which results in $\phi_1 = \frac{\sigma_{11}}{\sqrt{\sigma_{11}^2 + \sigma_{21}^2}}$ and $\phi_2 = \frac{\sigma_{21}}{\sqrt{\sigma_{11}^2 + \sigma_{21}^2}}$, as the normalized linear vector $\phi$.
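A final check on the same assumed numbers confirms that the dominant eigenvector, once normalized, points along $(\sigma_{11}, \sigma_{21})$:

```python
import numpy as np

s11, s21 = 3.0, 4.0  # illustrative values, as before
S_hat = np.array([[s11**2,  s11*s21],
                  [s21*s11, s21**2 ]])

eigvals, eigvecs = np.linalg.eigh(S_hat)
phi = eigvecs[:, -1]  # eigenvector of the dominant (largest) eigenvalue

# Expected normalized direction: (s11, s21) / sqrt(s11**2 + s21**2) = (0.6, 0.8)
expected = np.array([s11, s21]) / np.hypot(s11, s21)
print(phi, expected)  # equal up to an overall sign
```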
General Multi-Component Functions
The insights gained from studying linear vectors can help in understanding the significance of the Karhunen-Loève transform. As demonstrated above, the transform amounts to solving a familiar eigenvector equation to determine the fundamental eigenvectors in the new transform domain.
Detection of linear vectors is the first step in decomposing a general function $g(x, t)$ into its fundamental functions. Linear trends often obscure data; for example, a linear trend appears falsely as a low-frequency component when Fourier transform techniques are used. Typically, pre-processing of the data is necessary to remove the linear trends by means of linear regression, as in the fractal-wavelet technique and as sketched below. Using the Karhunen-Loève transform, we aim to develop a unified decomposition method.
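As one possible illustration of such pre-processing (the signal, the values, and the helper `detrend_linear` are invented for this example, not prescribed by the text), a least-squares straight-line trend can be removed before any transform is applied:

```python
import numpy as np

def detrend_linear(t, g):
    """Remove the least-squares straight-line trend from samples g(t)."""
    slope, intercept = np.polyfit(t, g, deg=1)  # fit g ~ slope*t + intercept
    return g - (slope * t + intercept)

# Hypothetical signal: a sinusoid riding on a linear trend.
t = np.linspace(0.0, 10.0, 500)
g = 0.5 * t + 2.0 + np.sin(2.0 * np.pi * t)

g_detrended = detrend_linear(t, g)
# The trend, which would otherwise masquerade as a low-frequency
# Fourier component, has been removed before any decomposition.
```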
 