The end result of the matching process is a list of groups of likely three-point correspondences that satisfy the geometric consistency constraint. The list is sorted so that correspondences that are far apart appear at the top. A rigid transformation is calculated for each group of correspondences, and a verification stage [9] is performed to obtain the best group. A detailed discussion of the sensitivity and robustness of the surface signature can be found in [26].
1.5.2 Maximization of Mutual Information (MI) Algorithm
MI is a basic concept from information theory, measuring the statistical dependence between two random variables, or the amount of information that one variable contains about the other. The MI registration criterion states that the MI of corresponding voxel pairs is maximal when the two volumes are geometrically aligned [31]. No assumptions are made regarding the nature of the relation between the image intensities in the two modalities.
Consider the two medical volumes to be registered as the reference volume R and the floating volume F. A voxel of the reference volume is denoted R(x), where x is the coordinate vector of the voxel. A voxel of the floating volume is denoted similarly as F(x). Given that T is a transformation matrix from the coordinate space of the reference volume to that of the floating volume, F(T(x)) is the floating volume voxel associated with the reference volume voxel R(x). MI seeks an estimate of the transformation matrix that registers the reference volume R and floating volume F by maximizing their mutual information. The vector x is treated as a random variable over coordinate locations in the reference volume. Mutual information is defined in terms of entropy in the following way [25]:
I(R(x), F(T(x))) = h(R(x)) + h(F(T(x))) − h(R(x), F(T(x))),    (1.19)
where h(R(x)) and h(F(T(x))) are the entropies of R and F, respectively, and h(R(x), F(T(x))) is their joint entropy. Entropy can be interpreted as a measure of uncertainty, variability, or complexity. The mutual information defined in Eq. (1.19) has three components. The first term on the right is the entropy of the reference volume, and is not a function of T. The second term is the entropy of the part of the floating volume into which the reference volume projects. It