take the natural logarithm of the above expression to replace the products by sums. We also get simplification from the fact that log(e^x) = x. The result is

log L = Σ_{(u,v) an edge} log(1 − e^{−Σ_C F_uC F_vC}) − Σ_{(u,v) not an edge} Σ_C F_uC F_vC        (10.1)
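The log-likelihood (10.1) can be evaluated directly. Below is a minimal Python sketch, assuming the model of this section, in which F[u][c] is node u's membership strength in community c and an edge (u, v) appears with probability 1 − e^{−Σ_c F[u][c]·F[v][c]}; the function name and data layout are illustrative, not from the book:

```python
import math

def log_likelihood(F, edges, nodes):
    """Log-likelihood of observing exactly the given edges, where
    F[u][c] is node u's membership strength in community c and an
    edge (u, v) appears with probability 1 - exp(-sum_c F[u][c]*F[v][c])."""
    edge_set = {frozenset(e) for e in edges}
    ll = 0.0
    nodes = list(nodes)
    for i, u in enumerate(nodes):
        for v in nodes[i + 1:]:
            s = sum(F[u][c] * F[v][c] for c in F[u])  # strength of the (u, v) tie
            if frozenset((u, v)) in edge_set:
                # Edge present: add the log of the edge probability.
                ll += math.log(max(1.0 - math.exp(-s), 1e-300))
            else:
                # Edge absent: log(exp(-s)) simplifies to -s, as noted above.
                ll -= s
    return ll
```

For instance, two nodes sharing one community with strengths 1.0 and an edge between them contribute log(1 − e^{−1}); a non-adjacent pair with the same strengths contributes −1.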
to use gradient descent in a manner similar to what was done in Section 9.4.5. That is, we pick one node x and adjust all the values of the F_xC in the direction that most improves the value of the expression. The terms that involve the F_xC are those where one of u and v is x and the other of u and v is a node adjacent to x. Since the degree of a node is typically much less than the number of edges in the graph, we can avoid looking at most of the terms in (10.1) at each step.
10.5.5 Exercises for Section 10.5
EXERCISE 10.5.1 Suppose graphs are generated by picking a probability p and choosing each edge independently with probability p, as in Example 10.21. For the graph of Fig. , what is the probability this graph is generated?
EXERCISE 10.5.2 Compute the MLE for the graph in Example 10.22 for the following guesses of the memberships of the two communities C and D.

(a) C = {w, x}; D = {y, z}.

(b) C = {w, x, y, z}; D = {x, y, z}.
EXERCISE 10.5.3 Suppose we have a coin, which may not be a fair coin, and we flip it some number of times, seeing h heads and t tails.

(a) If the probability of getting a head on any flip is p, what is the MLE for p, in terms of h and t?
! (b) Suppose we are told that there is a 90% probability that the coin is fair (i.e., p = 0.5), and a 10% chance that p = 0.1. For what values of h and t is it more likely that the coin is fair?
!! (c) Suppose the a-priori likelihood that p has a particular value is proportional to |p − 0.5|. That is, p is more likely to be near 0 or 1 than around 1/2. If we see h heads and t tails, what is the maximum likelihood estimate of p?