Mathematical Background
In this subsection, we summarize the background of the Karhunen-Loève approach. A random process x(t), defined in the time domain (or spatial domain) (0, T), can be expressed as a linear combination of orthogonal functions [19]:

$$x(t) = \sum_{i=1}^{N} y_i\,\phi_i(t), \qquad (1.19)$$

where the orthogonal functions $\phi_i(t)$ are deterministic functions, and the i-th coefficient vector $y_i$ contains random variables $y_j$. Eq. (1.19) is the Karhunen-Loève expansion of the original data, composed of a linear combination of the basis functions and the feature coefficients in the transformed domain.

To use this approach with real signals in the discrete domain, we take N time-sampled values of the time functions and convert them to vectors as shown in the following equations [19]:
$$X = [\,x(t_1) \;\ldots\; x(t_N)\,]^T, \qquad (1.20)$$

$$\phi_i = [\,\phi_i(t_1) \;\ldots\; \phi_i(t_N)\,]^T, \qquad (1.21)$$
where each time-sampled value $x(t_i)$ is a random variable.
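As a rough illustration of this discretization, the following NumPy sketch assembles M snapshot vectors of the form (1.20) into an N × M matrix. The synthetic zero-mean signal, the array names, and the dimensions N = 64 and M = 200 are illustrative assumptions, not values taken from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 64    # number of time samples per realization, x(t_1) ... x(t_N)
M = 200   # number of recorded realizations (snapshots)
t = np.linspace(0.0, 1.0, N)

# Each column is one snapshot X = [x(t_1) ... x(t_N)]^T  (Eq. 1.20).
# A synthetic process with random amplitudes stands in for real data.
X = np.empty((N, M))
for m in range(M):
    amp = rng.normal(size=2)
    X[:, m] = amp[0] * np.sin(2 * np.pi * t) + amp[1] * np.cos(4 * np.pi * t)

# Enforce the zero-mean assumption used in the derivation below.
X -= X.mean(axis=1, keepdims=True)
```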
Assuming zero-mean random processes, the covariance matrix equals the autocorrelation matrix and is computed as $R_x(t_l, t_k) = E[\,x(t_l)\,x(t_k)\,]$. If $\phi_i(t_k)$ are the eigenfunctions of $R(t_l, t_k)$, they must satisfy the following characteristic equation:
$$\sum_{k=1}^{N} R(t_l, t_k)\,\phi_i(t_k) = \lambda_i\,\phi_i(t_l), \qquad (1.22)$$
where $i, l = 1, 2, \ldots, N$. Equation (1.22) can be written in matrix form to define the eigenvalues $\lambda_i$ and eigenvectors $\phi_i$ as follows:
$$S\,\phi_i = \lambda_i\,\phi_i, \qquad (1.23)$$
where $i = 1, 2, \ldots, N$ and $S$ is the $N \times N$ covariance matrix defined by:
$$S = \begin{bmatrix} R(t_1, t_1) & \cdots & R(t_1, t_N) \\ \vdots & \ddots & \vdots \\ R(t_N, t_1) & \cdots & R(t_N, t_N) \end{bmatrix}. \qquad (1.24)$$
The covariance matrix equation is solved to obtain its eigenvalues and corresponding eigenvectors. The eigenvectors are orthonormal and hence need to satisfy $\Phi^T \Phi = I$, where $I$ is the identity matrix. The eigenvalues are ranked in order of significance, as compared to the total energy represented by the sum of all eigenvalues. The eigenvectors corresponding to the dominant eigenvalues represent the fundamental functions in the transformed domain. The Karhunen-Loève coefficients $y_i$ are computed by projecting the original data onto the new domain, represented by the dominant basis functions:
$$y_i = \phi_i^T X, \qquad (1.25)$$
where we have one coefficient vector $y_i = [\,y_1 \;\ldots\; y_M\,]$ corresponding to each eigenvector, and where M is the number of input snapshots. The resulting vector $Y = [\,y_1 \;\ldots\; y_N\,]^T$ contains the N coefficient vectors.
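The energy ranking and the projection (1.25) might be sketched as follows, continuing the same example; the 95% energy threshold and the variable names are arbitrary illustrative choices.

```python
# Rank the eigenvalues against the total energy (sum of all eigenvalues)
# and keep the dominant modes; the 95% threshold is illustrative only.
energy = np.cumsum(eigvals) / np.sum(eigvals)
n_modes = int(np.searchsorted(energy, 0.95) + 1)
Phi = eigvecs[:, :n_modes]          # dominant basis functions phi_i

# Karhunen-Loeve coefficients (Eq. 1.25): y_i = phi_i^T X, one value per
# snapshot, giving an n_modes x M coefficient matrix.
Y = Phi.T @ X

# A reduced-order reconstruction of the snapshots from the dominant modes.
X_hat = Phi @ Y
```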