forth. When the images are out of alignment, however, we will have duplicate versions of these structures from A and B.
Using this concept, registration can be thought of as reducing the amount of information in the combined image, which suggests the use of a measure of information as a registration metric. The most commonly used measure of information in signal and image processing is the Shannon–Wiener entropy measure H, originally developed as part of communication theory in the 1940s. 41,42
H = −∑_i p_i log p_i        (3.20)
H is the average information supplied by a set of i symbols whose probabilities are given by p_1, p_2, p_3, …, p_i.
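As a sketch, Equation 3.20 can be computed directly from a probability vector; the function name and the use of NumPy here are illustrative assumptions, not part of the original text. The base of the logarithm fixes only the units (bits for log base 2, nats for the natural log).

```python
import numpy as np

def shannon_entropy(p):
    """Shannon-Wiener entropy H = -sum(p_i * log2(p_i)) of a
    probability vector (or raw histogram counts) p.

    Symbols with zero probability contribute nothing to H, so they
    are dropped before taking the logarithm (0 * log 0 is taken as 0).
    """
    p = np.asarray(p, dtype=float)
    p = p / p.sum()      # normalize, in case p holds raw counts
    p = p[p > 0]         # avoid log(0)
    return -np.sum(p * np.log2(p))
```

For example, two equally likely symbols give shannon_entropy([0.5, 0.5]) = 1.0 bit.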
This formula, except for a multiplicative constant, is derived from three
conditions that a measure of uncertainty in a communication channel should
satisfy. These are
1. The functional should be continuous in p_i;
2. If all p_i equal 1/n, where n is the number of symbols, then H should be monotonically increasing in n; and
3. If a choice is broken down into a sequence of choices, then the original value of H should be the weighted sum of the constituent H values. That is,

H(p_1, p_2, p_3) = H(p_1, p_2 + p_3) + (p_2 + p_3) H(p_2/(p_2 + p_3), p_3/(p_2 + p_3)).
Shannon proved that the −∑_i p_i log p_i form was the only functional form satisfying all three conditions.
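The third condition can be checked numerically for any particular distribution; the probabilities below are chosen arbitrarily for illustration, and the helper H is a minimal sketch of Equation 3.20 using the natural logarithm.

```python
import math

def H(*p):
    """Entropy -sum(p_i * log(p_i)), natural log, of probabilities p."""
    return -sum(x * math.log(x) for x in p if x > 0)

p1, p2, p3 = 0.5, 0.3, 0.2
q = p2 + p3

# entropy of the choice made in a single step
direct = H(p1, p2, p3)

# the same choice broken into two stages: first choose between
# {p1} and {p2 or p3}, then choose within the second branch,
# weighted by that branch's probability q = p2 + p3
staged = H(p1, q) + q * H(p2 / q, p3 / q)

assert math.isclose(direct, staged)
```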
Entropy will have a maximum value if all symbols have equal probability of occurring (i.e., p_i = 1/n for all i), and have a minimum value of zero if the probability of one symbol occurring is one, and the probability of all the others occurring is zero.
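Both extremes are easy to verify numerically. In this sketch the alphabet size n = 8 and the in-between distribution are arbitrary illustrative choices; the maximum is log2(n) bits when entropy is measured in base 2.

```python
import math

def H(p):
    """Entropy in bits of a probability vector p."""
    return -sum(x * math.log2(x) for x in p if x > 0)

n = 8
uniform = [1 / n] * n               # all symbols equally probable
certain = [1.0] + [0.0] * (n - 1)   # one symbol always occurs
skewed = [0.5, 0.2, 0.1, 0.1, 0.05, 0.05, 0.0, 0.0]

# the uniform distribution attains the maximum, log2(8) = 3 bits
assert math.isclose(H(uniform), math.log2(n))
# a certain outcome gives the minimum, zero
assert H(certain) == 0.0
# any other distribution lies strictly between the two extremes
assert 0.0 < H(skewed) < math.log2(n)
```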
Any change in the data that tends to equalize the probabilities of the symbols p_1, p_2, p_3, …, p_i (i.e., that makes the histogram more uniform) increases the entropy. Blurring the data reduces noise, and so sharpens the histogram and results in reduced entropy. Registration algorithms often iteratively transform images, and the interpolation algorithms used for these transformations blur the data (as described more fully in Section 3.5). The consequences of interpolation-induced entropy changes need to be carefully considered.
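The effect of blurring on histogram entropy can be demonstrated on synthetic data; the choice of uniform noise, a 5-tap mean filter, and 64 histogram bins below are all illustrative assumptions. Averaging concentrates values near the mean, which sharpens the histogram and lowers the entropy.

```python
import numpy as np

def hist_entropy(x, bins=64):
    """Entropy in bits of the binned histogram of x over [0, 1]."""
    counts, _ = np.histogram(x, bins=bins, range=(0.0, 1.0))
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
signal = rng.uniform(0.0, 1.0, 10_000)   # nearly flat histogram

# a simple 5-sample moving-average ("box") blur
blurred = np.convolve(signal, np.ones(5) / 5, mode="valid")

h_before = hist_entropy(signal)
h_after = hist_entropy(blurred)

# blurring concentrates values near 0.5, so entropy drops
assert h_after < h_before
```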
3.4.8.1 Joint Entropy
In image registration we have two images, A and B, to align. We therefore have two symbols at each voxel location for any estimate of the transformation T. Joint entropy measures the amount of information we have in the combined images. 41 If A and B are totally unrelated, then the joint entropy will be