$$\mu = \frac{1}{pqs} \sum_{i=1}^{p} \sum_{j=1}^{q} \sum_{k=1}^{s} I_{ijk} \qquad (2)$$
Use the average to center every image matrix, and compute the covariance matrix of the sample space:
$$C = \frac{1}{pqs} \sum_{i=1}^{p} \sum_{j=1}^{q} \sum_{k=1}^{s} (I_{ijk} - \mu)(I_{ijk} - \mu)^{T} \qquad (3)$$
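Equations (2) and (3) can be sketched in NumPy as follows. The array name `blocks` and all dimensions are illustrative assumptions, not from the paper; `blocks` holds the sub-blocks of the training set with shape (p, q, s, m_b, n_b), i.e. p classes, q samples per class, s sub-blocks per sample, each sub-block of size m/b × n/b.

```python
import numpy as np

# Illustrative sizes (assumptions, not from the paper): p classes,
# q samples per class, s sub-blocks per sample, sub-blocks m_b x n_b.
p, q, s, m_b, n_b = 3, 4, 2, 8, 6
rng = np.random.default_rng(0)
blocks = rng.standard_normal((p, q, s, m_b, n_b))

# Equation (2): mean image matrix over all p*q*s sub-blocks.
mu = blocks.mean(axis=(0, 1, 2))           # shape (m_b, n_b)

# Equation (3): covariance of the centered sub-blocks,
# C = (1/pqs) * sum (I - mu)(I - mu)^T, giving an m_b x m_b matrix.
centered = (blocks - mu).reshape(-1, m_b, n_b)
C = np.einsum('kaj,kbj->ab', centered, centered) / centered.shape[0]
```

Accumulating (I − μ)(I − μ)ᵀ keeps C at m/b × m/b, which is what makes the later projection Eᵀ(I − μ) well-formed.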
Compute all the orthonormalized eigenvectors of the covariance matrix C and the corresponding eigenvalues; then the eigenvectors corresponding to the ℓ largest eigenvalues are selected to form a matrix E. The size of E is m/b × ℓ, and E^T × E is an identity matrix. The projection of a matrix from sample space to eigen space is:
$$\delta_{ijk} = E^{T} (I_{ijk} - \mu) \qquad (4)$$
Where I_{ijk} is the (i, j, k)th image matrix in the sample space, that is, the kth sub-block of the jth sample in the ith class, and δ_{ijk} is the corresponding projection matrix in eigen space.
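A minimal NumPy sketch of the eigenvector selection and the projection of Equation (4). All sizes and the stand-in covariance matrix are illustrative assumptions; `numpy.linalg.eigh` returns eigenvalues in ascending order, so the last ℓ columns correspond to the largest eigenvalues.

```python
import numpy as np

m_b, n_b, ell = 8, 6, 3                 # illustrative sizes; ell <= m_b
rng = np.random.default_rng(1)
A = rng.standard_normal((m_b, m_b))
C = A @ A.T                             # stand-in for the covariance of Eq. (3)

# eigh gives eigenvalues in ascending order: take the last `ell`
# columns, i.e. the eigenvectors of the ell largest eigenvalues.
eigvals, eigvecs = np.linalg.eigh(C)
E = eigvecs[:, -ell:]                   # shape (m_b, ell), orthonormal columns

# Equation (4): project a centered sub-block into eigen space.
mu = np.zeros((m_b, n_b))               # placeholder mean, for illustration
I_ijk = rng.standard_normal((m_b, n_b))
delta = E.T @ (I_ijk - mu)              # shape (ell, n_b)
```

Because the columns of E are orthonormal, Eᵀ E equals the ℓ × ℓ identity, and each m/b × n/b sub-block is reduced to an ℓ × n/b projection.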
The mean projection of a sub-block within the same class is computed as:
$$\eta_{ik} = \frac{1}{q} \sum_{j=1}^{q} \delta_{ijk} \qquad (5)$$
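Equation (5) is simply an average over the q samples of each class, taken per sub-block position. A sketch, assuming a `deltas` array of projections with the illustrative shape (p, q, s, ell, n_b):

```python
import numpy as np

p, q, s, ell, n_b = 3, 4, 2, 3, 6       # illustrative sizes
rng = np.random.default_rng(2)
deltas = rng.standard_normal((p, q, s, ell, n_b))   # delta_ijk projections

# Equation (5): average the projections over the q samples of each
# class, one mean per (class i, sub-block k) pair.
eta = deltas.mean(axis=1)               # shape (p, s, ell, n_b); eta[i, k]
```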
Every m/b × n/b dimensional matrix in the sample space is represented as an ℓ × n/b dimensional matrix in eigen space, with ℓ ≤ m/b (generally ℓ is much less than m/b).
2.2
Classification Phase
A character image to be recognized is considered a test sample; the kth sub-block of the test sample is a matrix I_{test,k}, whose low-dimensional projection in eigen space is:
$$\eta_{test,k} = E^{T} (I_{test,k} - \mu) \qquad (6)$$
The Euclidean distance from I_{test} to the ith class in the training database is computed as:
$$d_{i} = \frac{1}{s} \sum_{k=1}^{s} \left\| \eta_{test,k} - \eta_{ik} \right\| \qquad (7)$$
At last, the test sample is classified to the Γth class when:

$$\Gamma = \arg\min(D) \qquad (8)$$

Where D is the set of distances between the test sample and all classes, D = \{d_{1}, d_{2}, \ldots, d_{p}\}.
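The classification phase of Equations (6)–(8) can be sketched as follows. All names and sizes are illustrative assumptions; the norm in Equation (7) is taken as the Frobenius norm of the matrix difference between projections.

```python
import numpy as np

p, s, ell, n_b, m_b = 3, 2, 3, 6, 8     # illustrative sizes
rng = np.random.default_rng(3)
E = np.linalg.qr(rng.standard_normal((m_b, ell)))[0]   # orthonormal E
mu = rng.standard_normal((m_b, n_b))                   # training mean
eta = rng.standard_normal((p, s, ell, n_b))            # class means, Eq. (5)
test_blocks = rng.standard_normal((s, m_b, n_b))       # I_test,k sub-blocks

# Equation (6): project each test sub-block into eigen space.
eta_test = np.einsum('ma,smn->san', E, test_blocks - mu)  # (s, ell, n_b)

# Equation (7): mean Frobenius-norm distance to each class,
# averaged over the s sub-blocks.
d = np.linalg.norm(eta_test[None] - eta, axis=(2, 3)).mean(axis=1)  # (p,)

# Equation (8): classify to the class with the smallest distance.
Gamma = int(np.argmin(d))
```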
3
Hardware Architecture Design
3.1
Modified Distance Equation
A modified distance equation is introduced to simplify the operations of the M2DPCA eigen-space projection for hardware implementation. Based on Equations (4)–(6), the distance between the input character and the ith class on the kth sub-block can be computed as: