Then, PCA can be defined as follows:

1. The first principal direction $\mathbf{w}_1$ is the unit-norm vector maximizing the variance or power of (3.3) as measured by the function:

\[
\Psi_{\mathrm{PCA}}(\mathbf{w}) = \mathrm{E}\{z^2\} = \mathbf{w}^T \mathbf{R}_x \mathbf{w}. \tag{3.4}
\]
2. The second principal direction $\mathbf{w}_2$ is the unit-norm maximizer of criterion (3.4) lying orthogonal to the first principal direction $\mathbf{w}_1$, i.e., $\mathbf{w}_2^T \mathbf{w}_1 = 0$.
$\;\;\vdots$

k. In general, the $k$th principal direction $\mathbf{w}_k \in \mathbb{R}^L$ is the unit-norm maximizer of criterion (3.4) lying orthogonal to the previous principal directions $\{\mathbf{w}_j\}_{j=1}^{k-1}$, i.e., $\mathbf{w}_k^T \mathbf{w}_j = 0$, for $j < k$.
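As a sketch of this sequential definition (not taken from the text), the successive unit-norm maximizers of criterion (3.4) can be computed by power iteration combined with deflation of the covariance matrix; the toy data, dimensions, and NumPy usage below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy zero-mean data: 500 samples of a 4-dimensional signal (columns = samples).
X = rng.standard_normal((4, 500)) * np.array([[3.0], [2.0], [1.0], [0.5]])
R_x = X @ X.T / X.shape[1]                      # sample covariance matrix

def principal_directions(R, k):
    """Find k unit-norm directions maximizing w^T R w, one at a time,
    deflating R after each direction so the next maximizer is orthogonal."""
    L = R.shape[0]
    W = np.zeros((L, k))
    for i in range(k):
        w = rng.standard_normal(L)
        for _ in range(500):                    # power iteration
            w = R @ w
            w /= np.linalg.norm(w)
        W[:, i] = w
        R = R - (w @ R @ w) * np.outer(w, w)    # deflate: remove found direction
    return W

W = principal_directions(R_x, 2)
# Successive directions are orthonormal, and the first captures more variance.
print(np.round(W.T @ W, 6))
```

Deflation enforces the orthogonality constraint $\mathbf{w}_k^T \mathbf{w}_j = 0$ implicitly: once a direction's variance is subtracted from the covariance matrix, the next power iteration converges to a maximizer orthogonal to it.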
The principal components $\{z_1, z_2, \ldots, z_k\}$ are obtained by replacing $\mathbf{w}$ with the corresponding principal directions in Eq. (3.3). Simple algebraic manipulations show that the $k$th principal direction $\mathbf{w}_k$ is the $k$th dominant eigenvector of the data covariance matrix $\mathbf{R}_x$ defined in Eq. (3.2). Let

\[
\mathbf{R}_x = \mathbf{U}\mathbf{D}\mathbf{U}^T \tag{3.5}
\]
denote its eigenvalue decomposition (EVD), where the columns of unitary matrix $\mathbf{U} = [\mathbf{u}_1, \mathbf{u}_2, \ldots, \mathbf{u}_L] \in \mathbb{R}^{L \times L}$ contain the eigenvectors and diagonal matrix $\mathbf{D} = \mathrm{diag}(\lambda_1, \lambda_2, \ldots, \lambda_L) \in \mathbb{R}^{L \times L}$ stores the eigenvalues arranged in decreasing order. Then the principal directions are found in the columns of $\mathbf{U}$ and the principal components $\mathbf{z} = [z_1, z_2, \ldots, z_L]^T \in \mathbb{R}^L$ can be computed as

\[
\mathbf{z} = \mathbf{U}^T \mathbf{x}. \tag{3.6}
\]
Thus, according to this decomposition, the original data are expressed as the product of unitary matrix $\mathbf{U}$ of principal directions and vector $\mathbf{z}$ of principal components with decreasing variance:

\[
\mathbf{x} = \mathbf{U}\mathbf{z}.
\]
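The EVD route of Eqs. (3.5)-(3.6) can be sketched in a few lines of NumPy; the toy data and matrix sizes here are illustrative assumptions, not from the text:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy zero-mean data matrix: columns are observations x in R^L.
L, N = 4, 1000
X = rng.standard_normal((L, N)) * np.array([[3.0], [2.0], [1.0], [0.5]])
R_x = X @ X.T / N                       # sample covariance matrix R_x

# EVD of the covariance: R_x = U D U^T  (Eq. 3.5)
eigvals, U = np.linalg.eigh(R_x)        # eigh returns ascending eigenvalues
order = np.argsort(eigvals)[::-1]       # reorder to decreasing variance
eigvals, U = eigvals[order], U[:, order]

# Principal components of one observation x: z = U^T x  (Eq. 3.6)
x = X[:, 0]
z = U.T @ x

# The transform is lossless because U is unitary: x = U z
print(np.allclose(U @ z, x))            # True
```

Because `np.linalg.eigh` sorts eigenvalues in ascending order, the reordering step is what makes the components come out with decreasing variance, as the text requires.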
3.2.2.2 PCA as a Decorrelating Transform
Eqs. (3.5)-(3.6) prove that the covariance matrix of the principal components is diagonal:

\[
\mathbf{R}_z = \mathrm{E}\{\mathbf{z}\mathbf{z}^T\} = \mathbf{U}^T \mathrm{E}\{\mathbf{x}\mathbf{x}^T\}\mathbf{U} = \mathbf{U}^T \mathbf{R}_x \mathbf{U} = \mathbf{D}.
\]

As a result, the principal components are uncorrelated:

\[
\mathrm{E}\{z_i z_j\} = [\mathbf{R}_z]_{ij} = 0, \quad \text{for } i \neq j.
\]
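A quick numerical check of this decorrelation property (the mixing matrix and sample sizes below are illustrative assumptions) is to build correlated data, project it onto the principal directions, and inspect the resulting sample covariance:

```python
import numpy as np

rng = np.random.default_rng(2)

# Correlated zero-mean toy data: mix independent sources so that the
# coordinates of x are correlated.
A = np.array([[1.0, 0.8],
              [0.3, 1.0]])              # arbitrary mixing matrix
X = A @ rng.standard_normal((2, 5000))
R_x = X @ X.T / X.shape[1]

# Decorrelate with PCA: z = U^T x for every observation.
_, U = np.linalg.eigh(R_x)
Z = U.T @ X
R_z = Z @ Z.T / Z.shape[1]

# Off-diagonal entries of R_z vanish (up to floating-point error),
# i.e., the principal components are uncorrelated.
print(np.round(R_z, 10))
```

Since $\mathbf{R}_z = \mathbf{U}^T \mathbf{R}_x \mathbf{U}$ holds exactly for the sample covariance as well, the off-diagonal entries are zero to machine precision, not merely small.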
Hence, PCA can be considered as a decorrelating transform, whereby the original
data are transformed into uncorrelated components. Because of their decorrelation,