The algorithm ART (Algebraic Reconstruction Technique) was proposed in 1970 [18] and is one of the simplest iterative reconstruction methods. In this method the estimates along a given line are compared with the measured projection and corrected using a simple subtraction. The process proceeds projection by projection, iterating a certain number of times. With a suitable choice of the over-relaxation parameters, this algorithm has been shown to produce high-quality images [19].
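The ART update described above (compare the estimate along each line with the measurement, then correct by a relaxed subtraction) can be sketched as follows; the function and parameter names are illustrative, not taken from the source:

```python
import numpy as np

def art_reconstruct(A, p, n_iters=10, relax=0.5):
    """ART (Kaczmarz-style) reconstruction sketch.

    A     : (n_rays, n_pixels) system matrix, a_ij = contribution of
            pixel j to ray i
    p     : (n_rays,) measured projections
    relax : over-relaxation parameter (a tuning choice, as in the text)
    """
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        for i in range(A.shape[0]):      # process projection after projection
            a_i = A[i]
            norm = a_i @ a_i
            if norm == 0:
                continue
            # compare the estimate along this line (a_i @ x) with the
            # measured projection p[i] and correct by simple subtraction
            x += relax * (p[i] - a_i @ x) / norm * a_i
    return x
```

For a consistent (noise-free) system this sweep converges to a solution of the projection equations; the relaxation parameter trades convergence speed against noise amplification.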
In a Bayesian approach, the reconstruction (14) is seen as an optimization problem where the values X are the ones that best explain the observed data; that is, the reconstruction algorithm is guided to determine the most likely values of the image, X, given its projections, P, or, equivalently, to maximize the conditional probability P[X|P], which is the probability of X occurring given P [10].
According to Bayes' formula, we can rewrite the conditional probability as follows:
\[
P[X \mid P] = \frac{P[P \mid X]\, P[X]}{P[P]}.
\tag{15}
\]
As the denominator in (15) is constant, maximizing P[X|P] is equivalent to maximizing only the numerator. The term P[P|X] is called the likelihood and expresses how close the data and the image are or, again, the probability of P given X. In maximum-likelihood reconstruction the probability P[P|X] is maximized.
If a Poisson model is considered for the emission, the conditional probability is given by
\[
P[P \mid X] = \prod_i \frac{\left(\sum_j a_{ij} f_j\right)^{p_i}}{p_i!}\, e^{-\sum_j a_{ij} f_j}.
\tag{16}
\]
Applying logarithms to (16), the function to maximize becomes
\[
\ln P[P \mid X] = \sum_i \left\{ p_i \ln\!\left(\sum_j a_{ij} f_j\right) - \sum_j a_{ij} f_j - \ln(p_i!) \right\}.
\tag{17}
\]
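The Poisson log-likelihood of (17) is straightforward to evaluate numerically; a minimal sketch (names and the small guard against log(0) are my own) is:

```python
import math
import numpy as np

def poisson_loglik(A, f, p):
    """Poisson log-likelihood ln P[P|X] as in Eq. (17).

    A : (n_rays, n_pixels) system matrix a_ij
    f : (n_pixels,) current image estimate f_j
    p : (n_rays,) measured counts p_i
    """
    lam = A @ f                         # expected counts: sum_j a_ij f_j
    lam = np.maximum(lam, 1e-12)        # guard against log(0)
    log_fact = np.array([math.lgamma(c + 1.0) for c in p])  # ln(p_i!)
    return float(np.sum(p * np.log(lam) - lam - log_fact))
```

Since the ln(p_i!) term does not depend on f, it is often dropped when only the maximizer is of interest.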
Although there are several methods to maximize the likelihood (16), the most widely used is expectation maximization [20]. This method (ML-EM) is an iterative technique whose convergence is guaranteed [21]. It follows the general procedure explained above (Fig. 10) and can be summarized by the following equation:
\[
f_j^{(n+1)} = \frac{f_j^{(n)}}{\sum_\ell a_{\ell j}} \sum_i a_{ij} \frac{p_i}{\sum_k a_{ik} f_k^{(n)}}.
\tag{18}
\]
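The multiplicative update of Eq. (18) vectorizes directly with a system matrix; the sketch below assumes a dense NumPy matrix and illustrative names:

```python
import numpy as np

def mlem(A, p, n_iters=50, f0=None):
    """ML-EM iteration of Eq. (18).

    A : (n_rays, n_pixels) system matrix a_ij
    p : (n_rays,) measured counts p_i
    """
    n_pix = A.shape[1]
    f = np.ones(n_pix) if f0 is None else f0.astype(float)
    sens = np.maximum(A.sum(axis=0), 1e-12)   # sum_l a_lj (sensitivity image)
    for _ in range(n_iters):
        fwd = np.maximum(A @ f, 1e-12)        # forward projection: sum_k a_ik f_k
        f = f / sens * (A.T @ (p / fwd))      # multiplicative EM update
    return f
```

Starting from a strictly positive image, each update preserves non-negativity and monotonically increases the likelihood (17), which is why convergence is guaranteed.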
One of the most important variants of the ML-EM algorithm is the OSEM algorithm (Ordered Subsets Expectation Maximization) [22], which converges faster than its predecessor; however, there is no proof that OSEM converges to the same