Therefore, by combining Eqs. (3.21) and (3.22), it follows that:

    G × C × A^T = D.    (3.23)
We show that the regressor coefficients of C can be obtained by a least squares estimation deduced from Eq. (3.23), by using the direct product ⊗ between matrices, also called the Kronecker product, a special case of the tensor product used in linear algebra and in mathematical physics.
Given two real matrices A, B of dimension n × m and t × d respectively, the direct product A ⊗ B is the matrix of dimension nt × md, constituted by nm blocks (A ⊗ B)_{i,j}, such that, if A = (a_{i,j} | 1 ≤ i ≤ n, 1 ≤ j ≤ m), then (A ⊗ B)_{i,j} = a_{i,j} B (all the elements of B are multiplied by a_{i,j}). If we use the block notation, A ⊗ B can be represented in the following way:
    A ⊗ B = [ a_{1,1}B  a_{1,2}B  ...  a_{1,m}B
              a_{2,1}B  a_{2,2}B  ...  a_{2,m}B
                ...       ...     ...    ...
              a_{n,1}B  a_{n,2}B  ...  a_{n,m}B ]    (3.24)
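The block structure of Eq. (3.24) can be checked numerically. The sketch below uses NumPy's `np.kron` on small example matrices (the particular matrices are illustrative assumptions, not taken from the text):

```python
import numpy as np

# Direct (Kronecker) product of a 2x3 matrix A and a 2x2 matrix B:
# the result has dimension (2*2) x (3*2) and consists of blocks a_{i,j} * B.
A = np.array([[1, 2, 3],
              [4, 5, 6]])
B = np.array([[0, 1],
              [1, 0]])

K = np.kron(A, B)  # 4 x 6 matrix

# The (i, j) block of A ⊗ B equals a_{i,j} * B, as in Eq. (3.24).
assert K.shape == (4, 6)
assert np.array_equal(K[0:2, 0:2], A[0, 0] * B)   # top-left block: a_{1,1} B
assert np.array_equal(K[2:4, 4:6], A[1, 2] * B)   # bottom-right block: a_{2,3} B
```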
The Kronecker product is bilinear and associative, that is, it satisfies the following
equations:
    A ⊗ (B + C) = (A ⊗ B) + (A ⊗ C)
    (A + B) ⊗ C = (A ⊗ C) + (B ⊗ C)
    (kA) ⊗ B = A ⊗ (kB) = k(A ⊗ B)
    (A ⊗ B) ⊗ C = A ⊗ (B ⊗ C).
Moreover, the matrix direct product satisfies the following equations (the last one when the matrices are invertible):

    (A ⊗ B) × (C ⊗ D) = (A × C) ⊗ (B × D)
    (A ⊗ B)^T = A^T ⊗ B^T
    (A ⊗ B)^{-1} = A^{-1} ⊗ B^{-1}.
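The mixed-product, transpose, and inverse properties can likewise be checked numerically; a NumPy sketch (shapes chosen, as an assumption of this illustration, so that A × C and B × D are defined):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((2, 3))
C = rng.standard_normal((3, 4))
B = rng.standard_normal((5, 2))
D = rng.standard_normal((2, 6))

# Mixed-product property: (A ⊗ B) × (C ⊗ D) = (A × C) ⊗ (B × D).
assert np.allclose(np.kron(A, B) @ np.kron(C, D), np.kron(A @ C, B @ D))

# Transpose distributes over the Kronecker product.
assert np.allclose(np.kron(A, B).T, np.kron(A.T, B.T))

# Inverse distributes over the Kronecker product (square invertible matrices;
# random matrices are invertible with probability one).
P = rng.standard_normal((3, 3))
Q = rng.standard_normal((4, 4))
assert np.allclose(np.linalg.inv(np.kron(P, Q)),
                   np.kron(np.linalg.inv(P), np.linalg.inv(Q)))
```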
The following Lemma asserts a useful property of the Kronecker product [87].
Lemma 3.2 (Vectorization Lemma). Let us denote by vec(W) the vectorization of a matrix W, obtained by concatenating all the columns of W, in their order, in a unique column vector. Then the following equation holds:

    vec(A × X × B) = (B^T ⊗ A) × vec(X).    (3.25)
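A minimal NumPy sketch of the lemma, followed by the least squares use discussed around Eq. (3.23): assuming that equation has the form G × C × A^T = D, applying Eq. (3.25) with B := A^T gives vec(D) = (A ⊗ G) × vec(C), so C can be estimated by ordinary linear least squares (the shapes and matrix names below are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)

def vec(M):
    # vec(M): stack the columns of M, in order, into one column vector.
    return M.reshape(-1, order="F")

# Check Eq. (3.25): vec(A × X × B) = (B^T ⊗ A) × vec(X).
A = rng.standard_normal((2, 3))
X = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 5))
assert np.allclose(vec(A @ X @ B), np.kron(B.T, A) @ vec(X))

# Application sketch: recover C from G × C × A2^T = D by least squares,
# where A2 plays the role of A in Eq. (3.23).
G = rng.standard_normal((6, 3))
A2 = rng.standard_normal((5, 4))
C_true = rng.standard_normal((3, 4))
D = G @ C_true @ A2.T

M = np.kron(A2, G)  # coefficient matrix acting on vec(C)
c_hat, *_ = np.linalg.lstsq(M, vec(D), rcond=None)
C_hat = c_hat.reshape(C_true.shape, order="F")
assert np.allclose(C_hat, C_true)
```

Since M has full column rank here, the least squares solution recovers C exactly; with noisy D it would give the usual least squares estimate of vec(C).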
 