In order for a family of functions ψ_{ij} to be orthonormal under the inner product, they must meet two criteria. It must be the case that ⟨ψ_{ij}, ψ_{ij}⟩ = 1 for every i and j, and that ⟨ψ_{ij}, ψ_{lm}⟩ = 0 for any (i, j) ≠ (l, m), where ⟨f, g⟩ is the inner product defined as in (6.6) and f(x)* is the complex conjugate of f(x):

\langle f, g \rangle = \int_{-\infty}^{\infty} f(x)^{*} g(x) \, dx \qquad (6.6)
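As a concrete check of these criteria, the inner product in (6.6) can be approximated numerically. Below is a minimal sketch, assuming functions sampled on a uniform grid covering their support; the names inner_product, f1, and f2 are illustrative, not from the text:

```python
import numpy as np

def inner_product(f, g, x):
    # Approximate (6.6) with a Riemann sum over the sampled support;
    # f is conjugated, as in the definition.
    dx = x[1] - x[0]
    return np.sum(np.conj(f(x)) * g(x)) * dx

x = np.linspace(0.0, 1.0, 100_000, endpoint=False)

# Two unit-norm sinusoids supported on [0, 1]: an orthonormal pair.
f1 = lambda t: np.sqrt(2.0) * np.sin(2.0 * np.pi * t)
f2 = lambda t: np.sqrt(2.0) * np.sin(4.0 * np.pi * t)

print(inner_product(f1, f1, x))  # ~1.0: unit norm
print(inner_product(f1, f2, x))  # ~0.0: orthogonal
```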
The wavelet basis is very similar to the Fourier basis, with the exception that the wavelet basis functions do not have to have infinite support. In a wavelet transform the basis functions can be defined over a certain window and be zero everywhere else. As long as the family of functions defined by scaling and translating the mother wavelet is orthonormally complete, that family of functions can be used as the basis. With the Fourier transform, the basis is made up of sine and cosine waves that are defined over all values of x where −∞ < x < ∞.
One of the simplest wavelets is the Haar wavelet (Daubechies 2 wavelet). In a manner similar to the Fourier series, any continuous function f(x) defined on [0, 1] can be represented using the expansion shown in (6.7). The h_{j,k}(x) term is known as the Haar wavelet function and is defined as shown in (6.8); p_{J,k}(x) is known as the Haar scaling function and is defined in (6.9) [17]:
f(x) = \sum_{j=J}^{\infty} \sum_{k=0}^{2^{j}-1} \langle f, h_{j,k} \rangle \, h_{j,k}(x) + \sum_{k=0}^{2^{J}-1} \langle f, p_{J,k} \rangle \, p_{J,k}(x) \qquad (6.7)
h_{j,k}(x) = \begin{cases} 2^{j/2} & \text{if } 0 \le 2^{j}x - k < \tfrac{1}{2} \\ -2^{j/2} & \text{if } \tfrac{1}{2} \le 2^{j}x - k < 1 \\ 0 & \text{otherwise} \end{cases} \qquad (6.8)
p_{J,k}(x) = \begin{cases} 2^{J/2} & \text{if } 0 \le 2^{J}x - k < 1 \\ 0 & \text{otherwise} \end{cases} \qquad (6.9)
The combination of the Haar scaling function at the largest scale, along with the Haar wavelet functions, creates a set of functions that provides an orthonormal basis for functions in L².
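To make (6.7) through (6.9) concrete, the sketch below implements the Haar wavelet and scaling functions and reconstructs a function on [0, 1] from a truncated version of (6.7). The inner products are approximated by Riemann sums, and the names h, p, and haar_expand are illustrative assumptions, not from the text:

```python
import numpy as np

def h(j, k, x):
    # Haar wavelet function h_{j,k}(x) from (6.8).
    u = 2.0**j * x - k
    out = np.zeros_like(x)
    out[(u >= 0.0) & (u < 0.5)] = 2.0**(j / 2)
    out[(u >= 0.5) & (u < 1.0)] = -2.0**(j / 2)
    return out

def p(J, k, x):
    # Haar scaling function p_{J,k}(x) from (6.9).
    u = 2.0**J * x - k
    out = np.zeros_like(x)
    out[(u >= 0.0) & (u < 1.0)] = 2.0**(J / 2)
    return out

def haar_expand(f, x, J=0, j_max=8):
    # Truncated version of (6.7): scaling terms at scale J plus
    # wavelet terms for j = J, ..., j_max; <f, g> via Riemann sums.
    dx = x[1] - x[0]
    fx = f(x)
    approx = np.zeros_like(x)
    for k in range(2**J):
        basis = p(J, k, x)
        approx += np.sum(fx * basis) * dx * basis
    for j in range(J, j_max + 1):
        for k in range(2**j):
            basis = h(j, k, x)
            approx += np.sum(fx * basis) * dx * basis
    return approx

x = np.linspace(0.0, 1.0, 4096, endpoint=False)
f = lambda t: np.sin(2.0 * np.pi * t)  # any continuous f on [0, 1]
for j_max in (2, 4, 6):
    err = np.max(np.abs(haar_expand(f, x, j_max=j_max) - f(x)))
    print(j_max, err)  # error shrinks as more wavelet scales are kept
```

The shrinking reconstruction error illustrates the completeness of the family; orthonormality of individual pairs can be checked by feeding these functions to the inner product sketch shown after (6.6).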
Wavelets and short-term Fourier transforms also serve as the foundation for
other measures. Methods such as the spectral entropy method calculate some feature based on the power spectrum. Entropy was first used in physics as a thermodynamic quantity describing the amount of disorder in a system. Shannon extended its
application to information theory in the late 1940s to calculate the entropy for a
given probability distribution [18]. The entropy measure that Shannon developed
can be expressed as follows:
H = -\sum_{k} p_{k} \log p_{k} \qquad (6.10)
Entropy is a measure of how much information there is to learn from a random event occurring. Events that are unlikely to occur yield more information than events that are very probable. For spectral entropy, the power spectrum is considered as a probability distribution.
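As an illustration of (6.10) and of the spectral entropy measure described here, the sketch below normalizes a power spectrum so that it sums to one and applies Shannon's formula to it; the function names and test signals are illustrative assumptions, not from the text:

```python
import numpy as np

def shannon_entropy(pk):
    # Shannon entropy (6.10); terms with p_k = 0 contribute nothing.
    pk = pk[pk > 0]
    return -np.sum(pk * np.log(pk))

def spectral_entropy(signal):
    # Treat the normalized power spectrum as a probability
    # distribution and apply (6.10) to it.
    psd = np.abs(np.fft.rfft(signal))**2
    return shannon_entropy(psd / np.sum(psd))

fs = 256.0
t = np.arange(0.0, 2.0, 1.0 / fs)
tone = np.sin(2.0 * np.pi * 10.0 * t)  # power concentrated in one bin
noise = np.random.randn(t.size)        # power spread across all bins
print(spectral_entropy(tone))   # low entropy: predictable spectrum
print(spectral_entropy(noise))  # high entropy: power is spread out
```

A pure tone concentrates its power in a single spectral bin, so its spectrum carries little surprise, while broadband noise spreads power across many bins and yields high entropy, mirroring the information-theoretic interpretation above.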