$$\Delta_{0/1}(t, t^{(n)}) = \begin{cases} 1 & \text{if } t \neq t^{(n)}, \\ 0 & \text{otherwise}. \end{cases} \qquad (3.8)$$
The constrained optimization problem in Eq. (3.7) can be equivalently written as an unconstrained problem:
$$\min_{\mathbf{w}} \; L(\mathbf{w}) = \frac{1}{2}\|\mathbf{w}\|^{2} + C_{1} \sum_{n=1}^{N} R_{n}(\mathbf{w}), \qquad (3.9)$$

where

$$R_{n}(\mathbf{w}) = \max_{t}\Big[\Delta_{0/1}(t, t^{(n)}) + \max_{\mathbf{s}} \Phi(\mathbf{w}^{\top}\mathbf{x}^{(n)}, \mathbf{s}, t)\Big] - \max_{\mathbf{s}} \Phi(\mathbf{w}^{\top}\mathbf{x}^{(n)}, \mathbf{s}, t^{(n)}).$$
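To make the structure of Eq. (3.9) concrete, the following is a minimal sketch that evaluates $L(\mathbf{w})$ by brute-force enumeration over a small discrete label set and latent space. The names `score`, `T_LABELS`, and `S_SPACE` are illustrative stand-ins for the chapter's potential $\Phi(\mathbf{w}^{\top}\mathbf{x}^{(n)}, \mathbf{s}, t)$ and its label and attribute spaces; they are not the actual model.

```python
import itertools
import numpy as np

# Hypothetical stand-ins: a small target-label set T and a latent space S of
# binary attribute vectors. `score` abstracts the potential Phi(w^T x, s, t)
# as an arbitrary real-valued function.
T_LABELS = [0, 1, 2]
S_SPACE = list(itertools.product([0, 1], repeat=3))

def score(w, x, s, t):
    """Toy surrogate for Phi(w^T x, s, t)."""
    return float(w @ x) * (t + 1) + 0.1 * sum(s) * (t + 1)

def delta_01(t, t_true):
    """0/1 loss of Eq. (3.8)."""
    return 0.0 if t == t_true else 1.0

def risk_n(w, x_n, t_n):
    """R_n(w) from Eq. (3.9): loss-augmented max over (t, s) minus the best
    score attainable with the ground-truth label t_n."""
    aug = max(delta_01(t, t_n) + max(score(w, x_n, s, t) for s in S_SPACE)
              for t in T_LABELS)
    gt = max(score(w, x_n, s, t_n) for s in S_SPACE)
    return aug - gt

def objective(w, X, T_true, C1=1.0):
    """L(w) = 0.5 * ||w||^2 + C1 * sum_n R_n(w)."""
    return 0.5 * float(w @ w) + C1 * sum(
        risk_n(w, x_n, t_n) for x_n, t_n in zip(X, T_true))

# Toy usage on random data
rng = np.random.default_rng(0)
X, T_true = rng.normal(size=(5, 4)), [0, 2, 1, 1, 0]
w = rng.normal(size=4)
print(objective(w, X, T_true, C1=0.5))
```

Brute-force enumeration is only feasible for tiny spaces; the chapter instead relies on the inference machinery described below.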
We use the non-convex bundle optimization in [10] to solve Eq. (3.9). In a nutshell, the algorithm iteratively builds an increasingly accurate piecewise quadratic approximation of $L(\mathbf{w})$ based on its subgradient $\partial_{\mathbf{w}} L(\mathbf{w})$. The key issue is to compute the subgradient $\partial_{\mathbf{w}} L(\mathbf{w})$. We define:
$$\begin{aligned}
\hat{\mathbf{s}}^{(n)}_{t} &= \arg\max_{\mathbf{s}} \Phi(\mathbf{w}^{\top}\mathbf{x}^{(n)}, \mathbf{s}, t), \quad \forall n, \; \forall t \in T, \\
\bar{\mathbf{s}}^{(n)} &= \arg\max_{\mathbf{s}} \Phi(\mathbf{w}^{\top}\mathbf{x}^{(n)}, \mathbf{s}, t^{(n)}), \quad \forall n, \\
\hat{t}^{(n)} &= \arg\max_{t} \Big[\Delta_{0/1}(t, t^{(n)}) + \max_{\mathbf{s}} \Phi(\mathbf{w}^{\top}\mathbf{x}^{(n)}, \mathbf{s}, t)\Big], \quad \forall n,
\end{aligned} \qquad (3.10)$$
where $\partial_{\mathbf{w}} L(\mathbf{w})$ can be further computed as:
$$\partial_{\mathbf{w}} L(\mathbf{w}) = \mathbf{w} + C_{1} \sum_{n=1}^{N} \partial_{\mathbf{w}} \Phi(\mathbf{w}^{\top}\mathbf{x}^{(n)}, \hat{\mathbf{s}}^{(n)}_{\hat{t}^{(n)}}, \hat{t}^{(n)}) - C_{1} \sum_{n=1}^{N} \partial_{\mathbf{w}} \Phi(\mathbf{w}^{\top}\mathbf{x}^{(n)}, \bar{\mathbf{s}}^{(n)}, t^{(n)}). \qquad (3.11)$$
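As a complementary sketch of Eqs. (3.10) and (3.11), the code below computes the subgradient by first solving the argmax problems of Eq. (3.10) and then accumulating the two $\Phi$-derivative terms. It assumes, purely for illustration, that the score is linear in $\mathbf{w}$, i.e. $\Phi(\mathbf{w}^{\top}\mathbf{x}, \mathbf{s}, t) = \mathbf{w}^{\top}\psi(\mathbf{x}, \mathbf{s}, t)$ for a joint feature map $\psi$, so that $\partial_{\mathbf{w}}\Phi$ equals $\psi$ at the selected $(\mathbf{s}, t)$; `psi`, `T_LABELS`, and `S_SPACE` are hypothetical placeholders, and the argmaxes are done by enumeration rather than the graphical-model inference used in the chapter.

```python
import itertools
import numpy as np

T_LABELS = [0, 1, 2]                                  # hypothetical label set T
S_SPACE = list(itertools.product([0, 1], repeat=3))   # hypothetical latent space

def psi(x, s, t, dim=4, n_labels=3):
    """Hypothetical joint feature map; with Phi = w . psi(x, s, t),
    the derivative d/dw Phi is simply psi(x, s, t)."""
    f = np.zeros(dim * n_labels)
    f[t * dim:(t + 1) * dim] = x * (1.0 + 0.1 * sum(s))
    return f

def delta_01(t, t_true):
    """0/1 loss of Eq. (3.8)."""
    return 0.0 if t == t_true else 1.0

def subgradient(w, X, T_true, C1=1.0):
    """d_w L(w) of Eq. (3.11), assembled from the argmaxes of Eq. (3.10)."""
    g = w.copy()                                       # gradient of 0.5 * ||w||^2
    for x_n, t_n in zip(X, T_true):
        # s_bar^(n): best latent assignment under the ground-truth label t_n
        s_bar = max(S_SPACE, key=lambda s: w @ psi(x_n, s, t_n))
        # (t_hat^(n), s_hat^(n)): loss-augmented argmax over t and s
        best_val, t_hat, s_hat = -np.inf, None, None
        for t in T_LABELS:
            s_t = max(S_SPACE, key=lambda s: w @ psi(x_n, s, t))
            val = delta_01(t, t_n) + w @ psi(x_n, s_t, t)
            if val > best_val:
                best_val, t_hat, s_hat = val, t, s_t
        # C1 * ( d_w Phi at (s_hat, t_hat)  -  d_w Phi at (s_bar, t_n) )
        g += C1 * (psi(x_n, s_hat, t_hat) - psi(x_n, s_bar, t_n))
    return g

# Toy usage
rng = np.random.default_rng(0)
X, T_true = rng.normal(size=(5, 4)), [0, 2, 1, 1, 0]
w = rng.normal(size=4 * len(T_LABELS))
print(subgradient(w, X, T_true, C1=0.5))
```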
Using the subgradient $\partial_{\mathbf{w}} L(\mathbf{w})$, we can optimize Eq. (3.7) using the algorithm in [10] and output the optimal model parameter $\mathbf{w}$.

At each optimization iteration, we also infer the latent attribute variables $\mathbf{s}$:

$$\hat{\mathbf{s}}^{(n)} = \arg\max_{\mathbf{s}} \Phi(\mathbf{w}^{\top}\mathbf{x}^{(n)}, \mathbf{s}, t^{(n)}), \quad \forall n. \qquad (3.12)$$
This is a standard max-inference problem in an undirected graphical model, and we use loopy belief propagation [23] to solve it approximately.
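As an illustration of the inference behind Eq. (3.12), the following is a minimal, generic max-sum (log-domain max-product) loopy belief propagation routine for a pairwise undirected model, in the spirit of [23]. The synchronous message schedule, damping factor, iteration count, and the toy potentials in the usage example are illustrative choices, not the chapter's actual factor graph over latent attributes.

```python
import numpy as np

def loopy_max_product(unary, edges, pairwise, n_iters=30, damping=0.5):
    """Generic max-sum (log-domain max-product) loopy BP for approximate MAP
    inference in a pairwise undirected graphical model.

    unary:    {i: array of shape (K_i,)}          log unary potentials
    edges:    list of (i, j) tuples
    pairwise: {(i, j): array of shape (K_i, K_j)} log pairwise potentials
    Returns {i: MAP state of variable i}.
    """
    neighbors = {i: [] for i in unary}
    for i, j in edges:
        neighbors[i].append(j)
        neighbors[j].append(i)
    # Directed messages m[(i, j)](x_j), initialized to zero (uniform)
    msgs = {(i, j): np.zeros(len(unary[j])) for i in unary for j in neighbors[i]}

    def edge_pot(i, j):
        # Pairwise tables are stored once per undirected edge
        return pairwise[(i, j)] if (i, j) in pairwise else pairwise[(j, i)].T

    for _ in range(n_iters):
        new_msgs = {}
        for (i, j), old in msgs.items():
            # Sum the unary term and all incoming messages to i except from j
            incoming = unary[i] + sum(msgs[(k, i)] for k in neighbors[i] if k != j)
            # m_{i->j}(x_j) = max_{x_i} [ incoming(x_i) + theta_ij(x_i, x_j) ]
            m = np.max(incoming[:, None] + edge_pot(i, j), axis=0)
            m -= m.max()                       # normalize for numerical stability
            new_msgs[(i, j)] = damping * old + (1.0 - damping) * m
        msgs = new_msgs

    # Decode each variable from its max-marginal belief
    return {i: int(np.argmax(unary[i] + sum(msgs[(k, i)] for k in neighbors[i])))
            for i in unary}

# Toy usage: a 3-cycle (loopy graph) with 2 states per variable
unary = {0: np.log([0.6, 0.4]), 1: np.log([0.3, 0.7]), 2: np.log([0.5, 0.5])}
agree = np.log(np.array([[0.8, 0.2], [0.2, 0.8]]))   # "agreement" edge potential
pairwise = {(0, 1): agree, (1, 2): agree, (0, 2): agree}
print(loopy_max_product(unary, [(0, 1), (1, 2), (0, 2)], pairwise))
```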
3.5 Applications of Relational User Attribute Inference
Our proposed model can be applied in a variety of scenarios. We introduce two potential application scenarios in the following subsections.
 