
Appendix E: Resolution and Degrees of Freedom

We gain some additional understanding of the inverse scattering problem and its challenges by considering the scattered field measurement process, regarded as Fourier data, as a truncated sampling process. For a finite object of width D, the Whittaker-Shannon sampling theorem demands a sampling rate of at least B_min = 1/D in the frequency spectrum. The representation of the k-space spectrum is complete, however, only if an infinite number of samples is available covering the entire k-space. Since our k-space volume is physically limited to include only propagating waves, it is necessarily a truncated set of samples.
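As a concrete illustration (the triangle test object and all numerical values below are assumptions for this sketch, not taken from the text), one can sample the analytic spectrum of an object of width D at the minimum rate B_min = 1/D and reconstruct the object from the truncated set of samples by a discrete Fourier series:

```python
import numpy as np

# Illustrative sketch: spectrum of an object of width D sampled at the
# minimum Whittaker-Shannon rate B_min = 1/D, reconstructed from a
# truncated set of samples by a discrete Fourier series.
D = 2.0                       # object support width (assumed value)
dk = 1.0 / D                  # minimum k-space sampling interval B_min
n = np.arange(-64, 65)        # truncated (finite) set of k-space samples
k = n * dk

# Test object: a triangle supported on [-D/2, D/2].  Its analytic
# spectrum is (D/2) * sinc(k*D/2)**2  (np.sinc(t) = sin(pi*t)/(pi*t)).
F = (D / 2) * np.sinc(k * D / 2) ** 2

# Discrete Fourier series reconstruction on a spatial grid.
x = np.linspace(-D / 2, D / 2, 201)
f_rec = (dk * F[:, None] * np.exp(2j * np.pi * np.outer(k, x))).sum(axis=0).real
# Inside the support, f_rec approximates max(0, 1 - 2|x|/D) up to
# truncation ripple; outside [-D/2, D/2] the series repeats with period D.
```

Truncating the sample set only introduces a small ripple here because this particular spectrum decays quickly; the point of the passage is that the physically available k-space window enforces such a truncation regardless of the object.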

The image estimate based on a truncated set of k-space samples provides only a low-spatial-frequency estimate of the signal. For a spectrum sampled at the Nyquist rate determined by the object support, we can easily verify that the P-matrix of the PDFT is diagonal. This means that at this sampling rate the PDFT will not improve the image estimate beyond the classical limit; the model of the signal is already represented optimally, in a least-squares sense, by the available truncated sampling expansion, that is, by a discrete Fourier series.
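The diagonality at Nyquist-rate sampling can be checked numerically in a simplified analogue (an assumption for illustration, not the PDFT derivation itself): the Gram matrix of sinc interpolators placed on the Nyquist grid, which plays the role of the P-matrix here, comes out diagonal because the interpolators are orthogonal.

```python
import numpy as np

# Numerical check (illustrative analogue of the P-matrix, not the PDFT
# itself): sinc interpolators at unit Nyquist spacing are orthogonal,
# so their Gram matrix is (approximately) the identity.
x = np.linspace(-200, 200, 400001)          # wide, fine grid for the integral
dx = x[1] - x[0]
shifts = np.arange(-3, 4)                    # a few Nyquist-spaced sample points
S = np.sinc(x[None, :] - shifts[:, None])    # np.sinc(t) = sin(pi*t)/(pi*t)
G = S @ S.T * dx                             # Gram matrix of the interpolators
# Diagonal entries ~1, off-diagonal entries ~0 (up to truncation error).
```

Because this Gram matrix is the identity, no linear recombination of the samples can improve on the plain sampling expansion, matching the statement that the PDFT cannot beat the classical limit at this rate.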

It is worth contemplating why Fourier data sampled at the Nyquist rate preclude any hope of bandwidth extrapolation. The sampling expansion is constructed to obtain orthogonal interpolation functions; in other words, the zeros of the interpolating sinc function coincide exactly with the locations of the sampling points. Thus, the data are assumed to be independent and do not contain any information related to other sampling points. Points on the sampling grid of the spectrum outside the window of measured data do not contribute to the points inside this window; in turn, the latter cannot be used to estimate points on the sampling grid outside the data window. It is well known that spectral data sampled at the Nyquist rate do not contain sufficient information for bandwidth extrapolation and that a higher sampling rate is required. This is precisely the context to which the PDFT algorithm is applicable.

For oversampled data, the samples are no longer independent, and the coefficients of the PDFT reconstruction must be selected to ensure data consistency as a result of convolution with the interpolating function. This interdependency has two consequences. First, it provides the freedom to balance the PDFT coefficients to obtain improved signal resolution (and bandwidth extrapolation). Second, image reconstruction from interdependent samples results in high susceptibility to noise in the measured data. In particular, the improved image resolution is the result of a delicate interference between different interpolating functions, all of which carry the main portion of their energy outside the data window. Thus, even small errors inside the window are amplified in the extrapolated region. It has been shown that this confines
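A toy computation (an assumed setup, not the actual PDFT reconstruction) illustrates this noise sensitivity: packing sinc interpolators more densely than the Nyquist spacing makes their Gram matrix ill-conditioned, so inverting it to enforce data consistency strongly amplifies small errors in the data.

```python
import numpy as np

# Illustrative sketch: condition number of the Gram matrix of sinc
# interpolators at a given spacing.  Spacing 1.0 is the Nyquist grid
# (orthogonal, well-conditioned); spacing 0.5 is 2x oversampling
# (interdependent samples, ill-conditioned).
def gram_condition(spacing, n_funcs=9, half_width=300.0, dx=0.01):
    """Condition number of the Gram matrix of sincs at the given spacing."""
    x = np.arange(-half_width, half_width, dx)
    centers = (np.arange(n_funcs) - n_funcs // 2) * spacing
    S = np.sinc(x[None, :] - centers[:, None])
    G = S @ S.T * dx
    return np.linalg.cond(G)

c_nyquist = gram_condition(1.0)   # near 1: nearly orthogonal system
c_over = gram_condition(0.5)      # large: strongly coupled system
```

The large condition number for the oversampled system quantifies the trade-off described above: the same interdependence that permits bandwidth extrapolation also amplifies measurement noise.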
