Using Equation 3.26, we can rewrite the equation for mutual information as:
$$I(A, B) = H(A) - H(A|B) = H(B) - H(B|A) \qquad (3.27)$$
The conditional entropy term in Equation 3.27 will be zero if knowing the intensity value a in image A(x_A) enables us to perfectly predict the corresponding intensity value in B^T. Registration by maximization of mutual information, therefore, involves finding the transformation that makes image A the best possible predictor for image B^T within the region of overlap.
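To make Equation 3.27 concrete, the following is a minimal sketch (not from the text; the function name, NumPy dependency, and bin count are assumptions made here for illustration) that estimates the entropies from a joint intensity histogram of the overlapping voxels and returns the mutual information.

```python
# Illustrative sketch (an assumption, not the chapter's implementation):
# estimate I(A, B) = H(A) - H(A|B) from a joint intensity histogram of the
# voxels in the region of overlap. Bin count and log base are arbitrary choices.
import numpy as np

def mutual_information(a, b, bins=64):
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p_ab = joint / joint.sum()        # joint probability p(a, b)
    p_a = p_ab.sum(axis=1)            # marginal p(a)
    p_b = p_ab.sum(axis=0)            # marginal p(b)

    def entropy(p):
        p = p[p > 0]                  # ignore empty bins
        return -np.sum(p * np.log2(p))

    h_a, h_b, h_ab = entropy(p_a), entropy(p_b), entropy(p_ab)
    h_a_given_b = h_ab - h_b          # H(A|B) = H(A, B) - H(B)
    return h_a - h_a_given_b          # Equation 3.27
```

A registration routine would then search over candidate transformations T and evaluate this measure on A and B^T resampled within the region of overlap, keeping the transformation that maximizes it.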
Knowing the value of a voxel in image A reduces the uncertainty (and hence entropy) for the value of the corresponding location in image B when the images of the same object are correctly aligned. This can be thought of as a generalization of the assumption made by Woods in his PIU measure. The PIU measure assumes that, at registration, the uniformity of values in B corresponding to a given value a in A should be maximum. The information theoretic approaches assume that, at alignment, the value of a voxel in A is a good predictor of the value at the corresponding location in B. As misregistration increases, one image becomes a less good predictor of the second.
3.4.8.3 Normalized Mutual Information
Mutual information does not entirely solve the overlap problem described above. In particular, changes in overlap of very low intensity regions of the image (especially noise around the patient) can disproportionately contribute to the mutual information. Alternative normalizations of joint entropy have been proposed to overcome this problem.
Three normalization schemes have so far been proposed in journal articles. Equations 3.28 and 3.29 were mentioned in passing in the discussion section of Maes et al.45
$$\tilde{I}_1(A, B) = \frac{2\,I(A, B)}{H(A) + H(B)} \qquad (3.28)$$

$$\tilde{I}_2(A, B) = H(A, B) - I(A, B) \qquad (3.29)$$
Studholme has proposed an alternative normalization devised to overcome the sensitivity of mutual information to change in image overlap.47
$$\tilde{I}_3(A, B) = \frac{H(A) + H(B)}{H(A, B)} \qquad (3.30)$$
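As an illustration only (the function name, binning, and NumPy dependency are assumptions made here, not the authors' code), the three normalized measures of Equations 3.28 through 3.30 can be evaluated from the same joint histogram used for Equation 3.27:

```python
# Sketch of the three normalizations in Equations 3.28-3.30, computed from
# the same histogram-based entropy estimates as the mutual_information() sketch.
import numpy as np

def normalized_mutual_information(a, b, bins=64):
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p_ab = joint / joint.sum()

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    h_a = entropy(p_ab.sum(axis=1))   # H(A)
    h_b = entropy(p_ab.sum(axis=0))   # H(B)
    h_ab = entropy(p_ab)              # H(A, B)
    i_ab = h_a + h_b - h_ab           # I(A, B)

    i1 = 2.0 * i_ab / (h_a + h_b)     # Equation 3.28
    i2 = h_ab - i_ab                  # Equation 3.29
    i3 = (h_a + h_b) / h_ab           # Equation 3.30, Studholme's normalization
    return i1, i2, i3
```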
The third version of normalized mutual information has been shown to be considerably more robust than standard mutual information.47 Furthermore, it can