the transition probabilities $P = (p_{ss'}^{a})$, $s, s' \in S$, $a \in A$, and obtain the approximate transition probabilities, which we may use instead of the original $P$, hoping that the former turn out to be more stable.
Numerous further applications of tensor factorization to recommendation
engines are conceivable.
9.2 More Tensor Factorizations
Despite its power, the HOSVD is quite complex. Hence, the question arises
of whether there are simpler and less computationally intensive decompositions.
Indeed, this is possible in many cases and we shall address some important factorizations in the following.
9.2.1 CANDECOMP/PARAFAC
If we simplify the core tensor $C$ of a Tucker tensor to a diagonal tensor with
$$c_i = \begin{cases} 1, & i_1 = \cdots = i_d, \\ 0, & \text{else}, \end{cases}$$
we obtain the canonical decomposition (CANDECOMP) instead of (9.1). It is also known as Parallel Factor Analysis (PARAFAC), so we call it CANDECOMP/PARAFAC (CP).
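A minimal sketch of such a diagonal core in NumPy (the size $t = 4$ and the use of three modes are illustrative assumptions, not taken from the text): only the superdiagonal entries $c_{jjj}$ are set to one, all other entries are zero.

```python
import numpy as np

# Illustrative 3-mode diagonal core tensor of size t x t x t:
# c_i = 1 if i1 == i2 == i3, and 0 otherwise.
t = 4
core = np.zeros((t, t, t))
for j in range(t):
    core[j, j, j] = 1.0  # superdiagonal entry c_{jjj}

# Exactly t nonzero entries survive, all on the superdiagonal.
assert core.sum() == t
```

Replacing the dense core of a Tucker tensor by this sparse diagonal core is what reduces the storage and computational cost of the decomposition.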
Definition 9.5 An n-dimensional d-mode CP-tensor of (canonical) rank $t$ is a tensor
$$A = U^1 \circ_1 \cdots \circ_d U^d, \qquad (9.6)$$
where $U^k \in \mathbb{R}^{n_k \times t}$, $k = 1, \dots, d$, are the mode factors.
Remark 9.2 A CP-tensor is completely determined by its mode factors. For notational reasons that should become clear in the following, we shall henceforth place the mode index in an upper right position, i.e., $U^k$ as opposed to $U_k$.
By decomposing the mode factors along the rank index into vectors $u_j^k = \left(U_{i,j}^k\right)_{i \in n_k}$, we may write the CP-tensor as follows:
$$A = U^1 \circ_1 \cdots \circ_d U^d = \sum_{j=1}^{t} u_j^1 \otimes \cdots \otimes u_j^d. \qquad (9.7)$$
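The rank-one sum in (9.7) can be checked numerically. The following sketch (with assumed sizes $d = 3$, $n = (3, 4, 5)$, $t = 2$; the random factors are illustrative, not from the text) builds a CP-tensor both as a sum of outer products of the factor columns and, equivalently, as a Tucker-style contraction with a diagonal core:

```python
import numpy as np

# Assumed illustrative sizes: a rank-t CP-tensor with d = 3 modes.
rng = np.random.default_rng(0)
n1, n2, n3, t = 3, 4, 5, 2
U1 = rng.standard_normal((n1, t))  # mode factor U^1 in R^(n1 x t)
U2 = rng.standard_normal((n2, t))  # mode factor U^2
U3 = rng.standard_normal((n3, t))  # mode factor U^3

# Eq. (9.7): A = sum over j of the outer products u_j^1 (x) u_j^2 (x) u_j^3.
A = sum(np.einsum('i,j,k->ijk', U1[:, j], U2[:, j], U3[:, j])
        for j in range(t))

# Same tensor via a diagonal core tensor and mode-wise contractions.
C = np.zeros((t, t, t))
for j in range(t):
    C[j, j, j] = 1.0
B = np.einsum('abc,ia,jb,kc->ijk', C, U1, U2, U3)

assert np.allclose(A, B)
```

The equivalence of the two constructions is exactly the statement that a CP-tensor is a Tucker tensor whose core has been simplified to the diagonal tensor above.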