considered to be due to a positron emission or due to a random decay
process.
Some of these factors are due to the limitations of current instrumentation
(such as crystal quality and timing and energy calibration); their influence
cannot be predicted. Other factors are related to the detector geometry and
can be predicted on the basis of calculations or simulations. In either case,
a correction for these effects needs to be carried out. This is the
normalization correction. Generally speaking, two approaches can be taken.
In direct normalization, all lines of response see an activity distribution,
and data are acquired (at low count rates to minimize the effects of randoms
and scatters) for a sufficiently long time to reach adequate statistical
certainty on the normalization factors, which are proportional to the
reciprocal of the number of acquired counts. Assuming Poisson statistics, for
which the relative error on N counts is 1/sqrt(N), an error of 10% requires at
least 100 counts in every sinogram bin (the normalization factors are placed
in a sinogram and multiplied by the measured trues sinogram). For a modern
scanner with a large number of crystals this can imply a prohibitive amount of
time needed for normalization. Therefore, component-based normalization
schemes have become popular, in which as many components of the normalization
as possible are precomputed (e.g., geometrical factors). The crystal
efficiencies are estimated from a scan of, e.g., a cylindrical phantom, where
a particular crystal acquires data in coincidence with a sum of opposite
detectors. This increases the count statistics, and the assumption is that the
total number of coincidences measured is still a good measure of the crystal
efficiency.
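As an illustration of this fan-sum estimation of crystal efficiencies, the
following is a minimal sketch in Python/NumPy. It assumes the geometrical
factors have already been divided out of the data, and all names
(fan_sum_efficiencies, lor_pairs, and so on) are illustrative rather than
taken from any scanner software.

    import numpy as np

    def fan_sum_efficiencies(counts):
        # counts[i, j]: geometry-corrected coincidences between crystals
        # i and j from a uniform (e.g., cylindrical) phantom scan; zero
        # where the pair is never in coincidence.  If counts[i, j]
        # factorizes as eps[i] * eps[j] * const, the fan sum
        #     F[i] = sum_j counts[i, j] = eps[i] * (sum_j eps[j]) * const
        # is proportional to eps[i], provided every crystal sees a
        # comparable fan of opposing crystals.
        fan = counts.sum(axis=1)
        return fan / fan.mean()  # relative crystal efficiencies

    def normalization_factors(eps, lor_pairs):
        # Per-LOR normalization factors: the reciprocal of the product of
        # the two crystal efficiencies.  These are arranged in a sinogram
        # and multiplied with the measured trues sinogram.
        i, j = lor_pairs[:, 0], lor_pairs[:, 1]
        return 1.0 / (eps[i] * eps[j])

The benefit over direct normalization is that each efficiency estimate pools
counts from an entire fan of opposing crystals, so far fewer events per line
of response are needed to reach the same statistical precision.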
6.1.3 Noise equivalent count rates
The concept of noise equivalent count (NEC) rates addresses the fact that at
increased count rates the numbers of random and scattered events increase as
well. The NEMA recommendations for the performance evaluation of PET scanners
therefore prescribe the determination of NEC curves. In addition, the dead
time characteristics of the system need to be taken into account.
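For reference, the commonly used definition of the NEC rate in terms of the
trues rate $T$, the scatters rate $S$, and the randoms rate $R$ is

\[
\mathrm{NEC} = \frac{T^2}{T + S + kR},
\]

where $k = 1$ if the randoms estimate is noiseless and $k = 2$ if randoms are
subtracted using a delayed-window estimate; some formulations additionally
scale the randoms term by the fraction of the field of view occupied by the
object. The NEC rate can be read as the trues rate of an ideal, randoms- and
scatter-free measurement that would yield the same signal-to-noise ratio.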
6.1.4 System dead time
The term dead time refers to the finite time a general pulse processing
system needs in order to process an event [10, 11]. During this processing
time the system cannot process a subsequent incoming event. There are two
ways in which the system can cope with this second (or higher-order) event
(see Figure 6.2):

the system neglects the event, and after processing the first event the
next incoming event will be processed. Such a system is called non-
paralyzable. At increasing count rates the system will at first linearly
increase the number of processed events; if the count rate becomes of
the order of the reciprocal of the dead time, the measured rate saturates
at one event per dead time (see the sketch below).
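To make this count-rate behavior concrete, here is a small numerical sketch
of the two standard dead-time models: the non-paralyzable one described
above, and the paralyzable one, in which each incoming event restarts the
dead time. The dead-time value of 2 microseconds is an arbitrary
illustrative choice, not a property of any particular scanner.

    import numpy as np

    def nonparalyzable(true_rate, tau):
        # Events arriving during the dead time tau are simply discarded;
        # the measured rate saturates at 1/tau for high true rates.
        return true_rate / (1.0 + true_rate * tau)

    def paralyzable(true_rate, tau):
        # Each incoming event restarts the dead time, so the measured
        # rate peaks at 1/(e*tau) and then falls again at high true rates.
        return true_rate * np.exp(-true_rate * tau)

    tau = 2e-6  # illustrative dead time of 2 microseconds
    for n in (1e4, 1e5, 5e5, 1e6):
        print(f"true {n:9.0f}/s"
              f"   non-paralyzable {nonparalyzable(n, tau):9.0f}/s"
              f"   paralyzable {paralyzable(n, tau):9.0f}/s")

Running the loop shows both models agreeing with the true rate at low
activity and diverging strongly near 1/tau, which is exactly the regime the
NEC analysis above is designed to characterize.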
 