pixels, each assuming N gray levels g_1, g_2, …, g_N. The mutual information MI between Q and K can be defined as:
MI(Q, K) = H(Q) + H(K) − H(Q, K)    (7.8)
where H(·) is the entropy of an image. H(Q) can be written as:
H(Q) = −Σ_{i=1}^{N} P(Q = g_i) log[P(Q = g_i)]    (7.9)
P(Q = g_i) means the probability that a pixel in image Q will assume the value g_i. So, the image entropy can be written in terms of the image histogram His_Q:
H(Q) = −Σ_{i=1}^{N} [His_Q(i)/M] log[His_Q(i)/M]    (7.10)

where M is the number of pixels in the image.
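As a minimal sketch of Eq. 7.10, the entropy can be computed directly from a gray-level histogram; the helper name below is hypothetical, and the histogram is assumed to be a list of bin counts summing to the pixel count M.

```python
import math

def entropy_from_histogram(hist):
    """Image entropy H(Q) from a gray-level histogram (Eq. 7.10):
    H = -sum_i (His(i)/M) * log(His(i)/M), with M = total pixels."""
    M = sum(hist)                 # total number of pixels
    H = 0.0
    for count in hist:
        if count > 0:             # empty bins contribute nothing
            p = count / M         # probability of this gray level
            H -= p * math.log(p)
    return H

# A 4-pixel image with two gray levels occurring twice each:
# entropy is log 2 (natural logarithm) ≈ 0.693.
print(entropy_from_histogram([2, 2, 0, 0]))
```

A uniform image (all pixels in one bin) gives H = 0, the minimum; entropy is maximal when all gray levels are equally likely.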
Also, the joint entropy of two images Q and K with the same number of pixels M and the same gray-level range N can be written in terms of the joint image histogram His_QK:
H(Q, K) = −Σ_{i=1}^{N} Σ_{j=1}^{N} [His_QK(i, j)/M] log[His_QK(i, j)/M]    (7.11)
His_QK(i, j) is equal to the number of simultaneous occurrences of Q = g_i and K = g_j.
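The definitions above can be combined into a small estimator: build the marginal and joint histograms of two equal-size images and apply Eq. 7.8. This is a sketch, not the chapter's implementation; the function name is hypothetical, images are flat lists of integer gray levels, and all counts are normalized by the number of pixels M so that the probabilities sum to one.

```python
import math
from collections import Counter

def mutual_information(q, k):
    """MI(Q, K) = H(Q) + H(K) - H(Q, K) (Eq. 7.8), estimated from
    the marginal and joint gray-level histograms of two images of
    equal size, given as flat lists of integer gray levels."""
    assert len(q) == len(k)
    M = len(q)

    def entropy(counts):
        # H = -sum p log p over the non-empty histogram bins
        return -sum((c / M) * math.log(c / M) for c in counts if c > 0)

    h_q = entropy(Counter(q).values())            # H(Q)
    h_k = entropy(Counter(k).values())            # H(K)
    h_qk = entropy(Counter(zip(q, k)).values())   # joint entropy H(Q, K)
    return h_q + h_k - h_qk

# Identical images: MI equals the marginal entropy, here log 2.
print(mutual_information([0, 0, 1, 1], [0, 0, 1, 1]))
```

For statistically independent images the joint entropy equals H(Q) + H(K), so MI drops to zero, its minimum.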
The MI registration criterion states that the MI of the image intensity values
of corresponding voxel pairs is maximal if the images are geometrically aligned.
Because no assumption is made about the nature of the relation between the
image intensities, this criterion is very general and powerful. MI has been shown
to be robust for both multimodal and unimodal registration, and does not depend
on the specific dynamic range or intensity scaling of the images. The MI as previously defined is not a negative number. Because many optimization algorithms are formulated as minimization algorithms, the negative of MI (−MI) is often used as a similarity metric.
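As a toy illustration of −MI as a minimization cost (the signals, helper names, and exhaustive search below are all invented for the example, with one-dimensional "images" standing in for volumes): slide one signal over the other and keep the shift with the smallest negative MI.

```python
import math
from collections import Counter

def neg_mi(q, k):
    """Negative mutual information -MI(Q, K), usable as a cost
    to be minimized by a registration search."""
    M = len(q)
    def entropy(counts):
        return -sum((c / M) * math.log(c / M) for c in counts if c > 0)
    return (entropy(Counter(zip(q, k)).values())
            - entropy(Counter(q).values())
            - entropy(Counter(k).values()))

# Reference signal and a copy of it shifted by 2 samples.
ref = [0, 0, 1, 2, 1, 0, 0, 0, 0, 0]
mov = [0, 0, 0, 0, 1, 2, 1, 0, 0, 0]

# Exhaustive search over candidate shifts on the overlapping parts:
# the cost -MI is smallest where the signals are realigned.
costs = {s: neg_mi(ref[:len(ref) - s], mov[s:]) for s in range(4)}
best_shift = min(costs, key=costs.get)
print(best_shift)  # 2, the shift that realigns the signals
```

Real registration replaces the exhaustive 1-D search with an optimizer over 3-D geometric transformations, but the cost being minimized is the same −MI.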
7.3.2 PHANTOM EXPERIMENTS
In order to explain the differences among similarity metrics, an experiment was performed using synthetic images. A 3-D phantom consisting of two coaxial elliptical cylinders was constructed. Three regions were defined with three different signal levels, as depicted in Figure 7.2A. Gaussian noise was added to the phantom.