Here, rank($Y$) is a given value $r$ and the minimization is over $Y$ and the quantifications $z$, normalized as above. When $z$ is known, $Y$ is given by the usual $r$-dimensional Eckart-Young PCA solution (see Chapter 2) to approximating $H$. When $Y$ is known, the criterion (8.16) may be written

\[
\min \sum_{k=1}^{p} \left\| G_k z_k - y_k \right\|^2
\tag{8.17}
\]
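To make the objects in (8.17) concrete: under the usual optimal-scoring setup, each $G_k$ is the indicator (one-hot) matrix of the $k$th categorical variable, so that $L_k = G_k' G_k$ is the diagonal matrix of category counts. A minimal numerical sketch (the data are made up for illustration, not taken from the text):

```python
import numpy as np

# Hypothetical categorical variable: n = 6 samples, 3 category levels.
categories = np.array([0, 1, 1, 2, 0, 2])

# G_k: the n x l_k indicator matrix -- row i has a single 1 marking
# the category taken by sample i.
G_k = np.eye(3)[categories]

# L_k = G_k' G_k is the diagonal matrix of category counts, which is
# why the constraint z_k' G_k' G_k z_k = z_k' L_k z_k = 1 holds.
L_k = G_k.T @ G_k
print(np.diag(L_k))   # category counts: [2. 2. 2.]
```

With these definitions, $G_k z_k$ in (8.17) assigns each sample the quantified score of its category.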
so, writing $y_k$ for the $k$th column of $Y$, we may find the quantifications independently for each variable by solving

\[
\min \left\| G_k z_k - y_k \right\|^2,
\tag{8.18}
\]
subject to the constraints on $z_k$. We may arrange that $Y$ always has zero column sums, so only the constraint $z_k' G_k' G_k z_k = z_k' L_k z_k = 1$ needs attention. This minimization is a constrained regression problem. To find the solution to (8.18) we introduce the Lagrange multiplier $\lambda$ and consider

\[
z_k' G_k' G_k z_k - 2 z_k' G_k' y_k + y_k' y_k + \lambda \left( z_k' L_k z_k - 1 \right).
\tag{8.19}
\]
Taking the derivatives of (8.19) with respect to $\lambda$ and to $z_k$ and setting to zero, we obtain

\[
G_k' G_k z_k - G_k' y_k + \lambda L_k z_k = 0.
\tag{8.20}
\]
From (8.20), noticing that $z_k' G_k' G_k z_k = z_k' L_k z_k = 1$, it follows that $\lambda = z_k' G_k' y_k - 1$. Therefore,
\[
G_k' G_k z_k - G_k' y_k + \left( z_k' G_k' y_k - 1 \right) L_k z_k = 0,
\]

that is,

\[
z_k - L_k^{-1} G_k' y_k + \left( z_k' G_k' y_k - 1 \right) z_k = 0,
\]

or

\[
z_k = \frac{1}{z_k' G_k' y_k} \, L_k^{-1} G_k' y_k.
\]
But $z_k' L_k z_k = 1$ implies that

\[
z_k' G_k' y_k = \pm \sqrt{ y_k' G_k L_k^{-1} G_k' y_k },
\]
so that a solution to (8.18) is given by

\[
z_k = \frac{ L_k^{-1} G_k' y_k }{ \sqrt{ y_k' G_k L_k^{-1} G_k' y_k } }.
\tag{8.21}
\]
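A small numerical check of (8.21), assuming $G_k$ is the indicator matrix of a categorical variable and $y_k$ a centred column of $Y$ (the data below are made up for illustration):

```python
import numpy as np

# Illustrative 3-category variable on n = 8 samples (made-up data).
categories = np.array([0, 1, 2, 0, 1, 2, 0, 1])
G_k = np.eye(3)[categories]        # indicator matrix G_k
L_k = G_k.T @ G_k                  # diagonal matrix of category counts

# A column y_k of Y with zero column sum, as arranged in the text.
y_k = np.array([1.0, -2.0, 0.5, 2.0, 1.0, -1.5, -2.0, 1.0])

# Solution (8.21): z_k proportional to L_k^{-1} G_k' y_k, rescaled so
# that the constraint z_k' L_k z_k = 1 holds.
u = np.linalg.solve(L_k, G_k.T @ y_k)   # L_k^{-1} G_k' y_k: category means of y_k
z_k = u / np.sqrt(y_k @ G_k @ u)        # divide by sqrt(y_k' G_k L_k^{-1} G_k' y_k)

print(z_k @ L_k @ z_k)         # constraint z_k' L_k z_k, approx. 1.0
print(np.ones(3) @ L_k @ z_k)  # 1' L_k z_k, approx. 0.0
```

Note that `u` is exactly the vector of within-category averages of $y_k$, so the rescaling by the square root is all that distinguishes (8.21) from simple category means.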
It is easy to check that $z_k' L_k z_k = 1$ and $\mathbf{1}' L_k z_k = \mathbf{1}' y_k = 0$. Note that $z_k = L_k^{-1} G_k' y_k$ merely estimates the quantifications by the average values of $y_i$ obtained for the