3.4.8.2 Mutual Information
A solution to the overlap problem from which joint entropy suffers is to consider the information contributed to the overlapping volume by each image being registered, as well as the joint information. The information contributed by the images is simply the entropy of the portion of the image that overlaps with the other image volume:
H(A) = -\sum_a p_A^T(a) \log p_A^T(a)    (3.23)

H(B) = -\sum_b p_B^T(b) \log p_B^T(b)    (3.24)
where p_A^T and p_B^T are the marginal probability distributions, which can be thought of as the projections of the joint PDF onto the axes corresponding to the intensities in images A and B, respectively. It is important to remember that the marginal entropies are not constant during the registration process. Although the information content of the images being registered is constant, the information content of the portion of each image that overlaps with the other image will change with each change in the estimated registration transformation T (i.e., the change in overlap). The superscripts on the formulae for the marginal probability distributions reflect this dependence of the probability distribution on T for image A and on T for image B, which is resampled at each iteration. Furthermore, with each iteration, image B is transformed to B^T, which involves interpolation, further altering the probabilities.
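As a concrete illustration of Equations 3.23 and 3.24, the sketch below estimates the marginal entropies from the voxels that lie inside the overlap of the two images under a candidate transformation, using a binned joint histogram as the PDF estimate. This is a minimal NumPy sketch rather than code from the chapter or its references; the helper name marginal_entropies, the 32-bin histogram, and the toy images are illustrative assumptions.

```python
import numpy as np

def marginal_entropies(a_vals, b_vals, bins=32):
    """Entropies of the intensities restricted to the overlap region.

    a_vals, b_vals: 1-D arrays of intensities drawn from the voxels where
    the two images overlap under the current transformation estimate T.
    """
    # Joint histogram over the overlap, normalized to a joint PDF p_AB^T(a, b)
    joint, _, _ = np.histogram2d(a_vals, b_vals, bins=bins)
    p_ab = joint / joint.sum()

    # Marginal PDFs p_A^T(a) and p_B^T(b): projections of the joint PDF
    p_a = p_ab.sum(axis=1)
    p_b = p_ab.sum(axis=0)

    def entropy(p):
        p = p[p > 0]                   # treat 0 log 0 as 0
        return -np.sum(p * np.log(p))  # natural log (entropy in nats)

    return entropy(p_a), entropy(p_b)

# Toy example: a correlated pair of images, cropped to a common sub-region
# to stand in for the overlap under the current estimate of T.
rng = np.random.default_rng(0)
A = rng.normal(size=(64, 64))
B = 0.7 * A + 0.3 * rng.normal(size=(64, 64))
a_overlap = A[5:, :].ravel()   # portion of A inside the overlap
b_overlap = B[5:, :].ravel()   # corresponding portion of B
print(marginal_entropies(a_overlap, b_overlap))
```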
Communication theory provides a technique for measuring the joint entropy with respect to the marginal entropies. This measure, introduced as "rate of transmission of information" by Shannon in his 1948 paper that founded information theory,41 has become known as mutual information I(A, B), and was independently and simultaneously proposed for intermodality medical image registration by researchers in Leuven, Belgium,44,45 and at the Massachusetts Institute of Technology in the U.S.30,46
I(A, B) = H(A) + H(B) - H(A, B) = \sum_{a,b} p_{AB}^T(a, b) \log \frac{p_{AB}^T(a, b)}{p_A^T(a)\, p_B^T(b)}    (3.25)
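The two forms of Equation 3.25 can be checked numerically: computing I(A, B) as H(A) + H(B) - H(A, B) gives the same value as evaluating the double sum directly. The sketch below is a minimal NumPy illustration under the same joint-histogram assumption as above, not an implementation from the registration literature cited here; the function name mutual_information and the synthetic intensities are assumptions.

```python
import numpy as np

def mutual_information(a_vals, b_vals, bins=32):
    """Mutual information I(A, B) of the overlapping intensities (Eq. 3.25)."""
    joint, _, _ = np.histogram2d(a_vals, b_vals, bins=bins)
    p_ab = joint / joint.sum()      # joint PDF p_AB^T(a, b)
    p_a = p_ab.sum(axis=1)          # marginal p_A^T(a)
    p_b = p_ab.sum(axis=0)          # marginal p_B^T(b)

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    h_a, h_b, h_ab = entropy(p_a), entropy(p_b), entropy(p_ab.ravel())

    # Direct evaluation of the double sum in Eq. 3.25
    nz = p_ab > 0
    i_direct = np.sum(p_ab[nz] * np.log(p_ab[nz] / np.outer(p_a, p_b)[nz]))

    # The two forms agree up to floating-point error
    assert np.isclose(h_a + h_b - h_ab, i_direct)
    return h_a + h_b - h_ab

rng = np.random.default_rng(1)
A = rng.normal(size=10_000)
B = 0.8 * A + 0.2 * rng.normal(size=10_000)  # correlated intensities
print(mutual_information(A, B))
```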
Mutual information can qualitatively be thought of as a measure of how well
one image explains the other, and is maximized at the optimal alignment. We
can make our description more rigorous if we think more about probabilities.
The conditional probability p(b|a) is the probability that B will take the value b given that A has the value a. The conditional entropy is therefore the average of the entropy of B for each value of A, weighted according to the probability of getting that value of A.
H(B \mid A) = -\sum_{a,b} p_{AB}^T(a, b) \log p^T(b \mid a) = H(A, B) - H(A)    (3.26)
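As a numerical check of Equation 3.26, the conditional entropy H(B|A) can be computed either directly from p^T(b|a) or as H(A, B) - H(A); the mutual information is then H(B) - H(B|A). The sketch below is again a hedged NumPy illustration with an assumed helper name (conditional_entropy), not code from the chapter.

```python
import numpy as np

def conditional_entropy(a_vals, b_vals, bins=32):
    """H(B|A) from the joint histogram of the overlap (Eq. 3.26)."""
    joint, _, _ = np.histogram2d(a_vals, b_vals, bins=bins)
    p_ab = joint / joint.sum()   # p_AB^T(a, b)
    p_a = p_ab.sum(axis=1)       # p_A^T(a)

    # p^T(b|a) = p_AB^T(a, b) / p_A^T(a), defined where p_A^T(a) > 0
    p_b_given_a = np.divide(p_ab, p_a[:, None],
                            out=np.zeros_like(p_ab),
                            where=p_a[:, None] > 0)

    nz = p_ab > 0
    h_b_given_a = -np.sum(p_ab[nz] * np.log(p_b_given_a[nz]))

    # Equivalent form of Eq. 3.26: H(B|A) = H(A, B) - H(A)
    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))
    assert np.isclose(h_b_given_a, entropy(p_ab.ravel()) - entropy(p_a))

    return h_b_given_a

rng = np.random.default_rng(2)
A = rng.normal(size=10_000)
B = 0.8 * A + 0.2 * rng.normal(size=10_000)
print(conditional_entropy(A, B))
```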