nodes, like in Fig. 9.6b, defines a tensor-train (TT) decomposition. It has the following form:
a_{i_1, \ldots, i_d} = \sum_{\alpha_1 = 1}^{t_1} \cdots \sum_{\alpha_{d-1} = 1}^{t_{d-1}} g^{(1)}_{i_1, \alpha_1} \, g^{(2)}_{i_2, \alpha_1, \alpha_2} \cdots g^{(d-1)}_{i_{d-1}, \alpha_{d-2}, \alpha_{d-1}} \, g^{(d)}_{i_d, \alpha_{d-1}}.   (9.12)
Here we again allow the ranks t_k to be different for each mode. For the TT decomposition, we call the ranks t_k compression ranks.
In matrix form, (9.12) can be represented as a product of matrices

a_{i_1, \ldots, i_d} = G_{i_1} G_{i_2} \cdots G_{i_{d-1}} G_{i_d},   (9.13)
where G_{i_k} is a matrix of size t_{k-1} × t_k. Note that the first matrix G_{i_1} is a row of dimension 1 × t_1 and the last matrix G_{i_d} is a column of dimension t_{d-1} × 1. Thus, the product of the d matrices is a 1 × 1 matrix, i.e., a number.
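The matrix-product form (9.13) is straightforward to sketch in code. Below is a minimal NumPy illustration; the core layout (each core k stored as an array of shape (t_{k-1}, n_k, t_k) with t_0 = t_d = 1) and the helper name `tt_entry` are our own choices, not from the text:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, t = 4, 5, 3                    # mode size, order, compression rank
shapes = [(1, n, t)] + [(t, n, t)] * (d - 2) + [(t, n, 1)]
cores = [rng.standard_normal(s) for s in shapes]   # hypothetical TT cores

def tt_entry(cores, idx):
    """Entry a_{i_1,...,i_d} as the matrix product G_{i_1} ... G_{i_d}."""
    G = cores[0][:, idx[0], :]       # 1 x t_1 row
    for core, i in zip(cores[1:], idx[1:]):
        G = G @ core[:, i, :]        # multiply by a t_{k-1} x t_k matrix
    return G.item()                  # the product is 1 x 1, i.e., a number

print(tt_entry(cores, (0, 1, 2, 3, 0)))
```

Fixing the mode indices i_k selects one matrix slice per core; chaining the d matrix products collapses the row, square, and column factors to a scalar, exactly as described above.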
For equal mode sizes, n_k = n, k = 1, …, d, we have dn matrices G_{i_k}, and the number of parameters of (9.13) is bounded by (d − 2)nt² + 2nt, where t = max_k t_k; thus the dependence on d is linear.
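A quick check of this bound (the sample values of n, t, and d are our own) shows how the TT format compares with full storage:

```python
# Illustration of the parameter bound above: with equal mode sizes n_k = n and
# ranks t_k = t, the d cores of (9.13) hold at most (d - 2)*n*t**2 + 2*n*t
# numbers, while the full tensor needs n**d entries -- exponential in d.
n, t = 10, 5
for d in (3, 5, 10):
    tt_params = (d - 2) * n * t**2 + 2 * n * t
    print(f"d={d}: TT {tt_params} vs full {n**d}")
```

Already at d = 10 the full tensor has 10^10 entries, while the TT bound is a few thousand parameters.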
The following algorithm, TT-SVD, calculates a TT decomposition based on truncated SVDs over all modes and can be considered as a counterpart to the truncated HOSVD Algorithm 9.1.
Algorithm 9.3 TT-SVD
Input: tensor A ∈ R^{n_1 × ⋯ × n_d}, truncation ranks t_1, …, t_{d−1}
Output: cores G_1, …, G_d of TT approximation
1: temporary tensor B := A, t_0 := 1, N_0 := ∏_{k=1}^{d} n_k
2: for k = 1, …, d − 1
3:   calculate the dimension N_k := N_{k−1}/n_k and unfold B into the matrix B ∈ R^{(t_{k−1} n_k) × N_k}
4:   compute the rank-t_k truncated SVD B ≈ U S V^T of B
5:   reshape U such that G_k := U ∈ R^{t_{k−1} × n_k × t_k}
6:   B := S V^T
7: end for
8: G_d := B
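The steps of TT-SVD can be sketched compactly in NumPy. This is a minimal illustration under our own conventions (cores of shape (t_{k-1}, n_k, t_k); the function name `tt_svd` and variable names are ours), not the book's reference implementation:

```python
import numpy as np

def tt_svd(A, ranks):
    """Sketch of a TT-SVD: sequential unfoldings and truncated SVDs.

    `ranks` plays the role of the truncation ranks t_1, ..., t_{d-1};
    returns cores G_k of shape (t_{k-1}, n_k, t_k) with t_0 = t_d = 1."""
    dims = A.shape
    d = len(dims)
    t = [1] + list(ranks)                   # t_0 := 1
    N = A.size                              # N_0 := prod_k n_k
    B = A                                   # temporary tensor B := A
    cores = []
    for k in range(d - 1):
        N //= dims[k]                       # N_k := N_{k-1} / n_k
        B = B.reshape(t[k] * dims[k], N)    # unfold: (t_{k-1} n_k) x N_k
        U, S, Vt = np.linalg.svd(B, full_matrices=False)
        r = t[k + 1]                        # rank-t_k truncation
        cores.append(U[:, :r].reshape(t[k], dims[k], r))
        B = S[:r, None] * Vt[:r, :]         # B := S V^T
    cores.append(B.reshape(t[-1], dims[-1], 1))   # G_d := B
    return cores
```

For a tensor whose sequential unfoldings have rank at most t_k, the truncated SVDs discard nothing and the cores reproduce the tensor exactly.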
Moreover, Algorithm 9.3 can be extended to select the truncation ranks automatically. To this end, we first introduce a bound δ. Then, in step 4 of Algorithm 9.3, we determine the minimal rank t_k such that
B = U S V^T + E,   \|E\|_F \le \delta.   (9.14)
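Since the Frobenius norm of the SVD truncation error is the root of the sum of the squared discarded singular values, the minimal rank in (9.14) can be read off the singular values directly. A small sketch (the helper name `minimal_rank` is ours):

```python
import numpy as np

def minimal_rank(B, delta):
    """Smallest t such that keeping t singular triplets leaves ||E||_F <= delta."""
    S = np.linalg.svd(B, compute_uv=False)
    # tail[t] = sqrt(sum of squared singular values discarded when keeping t terms
    tail = np.sqrt(np.cumsum(S[::-1] ** 2))[::-1]
    for t in range(len(S) + 1):
        err = tail[t] if t < len(S) else 0.0
        if err <= delta:
            return t
    return len(S)
```

For example, a matrix with singular values 3, 2, 1 needs rank 2 for δ = 1 (discarded error exactly 1) but full rank 3 for any smaller δ.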
Then the following theorem holds.