the input signal. This quantity is related to the number of gray levels in the
image. An image quantized using 8-bits can accommodate 256 distinct intensity
values, i.e., gray levels. Quantization using fewer bits degrades the radiometric
resolution and reduces the accuracy, and hence the quality, of an image.
Images quantized with a higher number of bits appear vivid as they cover a wider
dynamic range. While most commonly used images employ 8-bit quantization,
sensors that can capture 10-bit, 12-bit, and 16-bit data have been manufactured.
Several hyperspectral sensors provide the reflectance response as 12-bit data.
To fully exploit such high dynamic range images, however, one requires a display
device with a comparable capability. Real-world scenes are sometimes
captured in the form of multiple observations obtained by varying the parameters
of the imaging system in order to cover as large a dynamic range as possible.
These multi-exposure observations are then blended together to give the viewer
the impression of a high dynamic range (HDR) image. The technique of compositing
an HDR-like image from multiple observations captured by a low dynamic
range (LDR) device at varying exposure times is an interesting research area
in computational photography. Various methodologies for compositing include
blending regions from different LDR images [67], variational formulation [145],
maximum likelihood (ML) estimation [108], and exposure fusion [113]. Since
most of the hyperspectral data already comes with a fine quantization level, this
aspect is not important for hyperspectral image visualization.
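The multi-exposure blending idea can be illustrated with a minimal sketch. The version below is a simplified, per-pixel take on exposure fusion, assuming aligned grayscale exposures in [0, 1] and weighting each pixel only by its "well-exposedness" (a Gaussian centered at mid-gray); practical methods such as [113] also use contrast and saturation cues and blend across a multi-resolution pyramid.

```python
import numpy as np

def exposure_fusion(exposures, sigma=0.2):
    """Naive exposure fusion for aligned grayscale images in [0, 1].

    Each pixel is weighted by its 'well-exposedness' -- a Gaussian
    centered at mid-gray -- and the weighted average over the exposure
    stack is returned. (An illustrative sketch, not the full method.)
    """
    stack = np.stack(exposures, axis=0)            # shape (K, H, W)
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0, keepdims=True)  # normalize over exposures
    return (weights * stack).sum(axis=0)

# Example: three synthetic exposures of the same 4x4 scene
under = np.linspace(0.0, 0.4, 16).reshape(4, 4)
mid   = np.clip(under + 0.3, 0.0, 1.0)
over  = np.clip(under + 0.6, 0.0, 1.0)
fused = exposure_fusion([under, mid, over])
```

Because the weights favor mid-gray values, well-exposed pixels dominate the blend while under- and over-exposed ones contribute little.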
3. Sharpness : While image sharpness can be interpreted in different ways, it is a
measure of clarity of detail in an image. High frequencies in the image correspond
to the detail, and hence some definitions of sharpness are related to the modulation
transfer function (MTF) of the imaging device which represents the magnitude
of the normalized spatial frequency response [80]. The perceived sharpness of
the image thus depends on how the MTF attenuates different spatial frequencies.
perceived sharpness is associated with the quality of edges and boundaries which
have significant high frequency components. This perceived sharpness is affected
by factors such as spatial resolution and acutance [98]. As spatial resolution
describes the ability to discriminate finer objects, images with higher spatial
resolution can provide a better delineation of an object due to reduced aliasing.
Acutance describes how quickly image information changes at an edge; thus, a
high acutance results in sharp transitions and detail with well-defined boundaries.
Acutance can be calculated from the mean square density gradient across the
edge [3].
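The mean-square-gradient idea can be sketched as follows. This is an illustrative formulation, not necessarily the exact expression in [3]: the squared density gradient is averaged across a 1-D edge profile and normalized by the total density difference.

```python
import numpy as np

def acutance(profile, dx=1.0):
    """Acutance of a 1-D edge density profile.

    Computed as the mean squared density gradient across the edge,
    normalized by the total density difference -- one common form of
    the mean-square-gradient definition (an illustrative sketch).
    """
    grad = np.diff(profile) / dx
    density_range = profile[-1] - profile[0]
    return np.mean(grad ** 2) / density_range

# A sharp edge yields higher acutance than a gradual one
sharp   = np.array([0.0, 0.0, 0.1, 0.9, 1.0, 1.0])
gradual = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
```

For the two profiles above, both span the same density range, but the concentrated transition in `sharp` produces larger squared gradients and hence a higher acutance.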
Sharpening enhances the perceived details by creating more pronounced edges
and boundaries in the image. Unsharp masking has been one of the most popular
image sharpening techniques, especially in the printing industry. The
low pass filtered version of an image is compared with the original for selec-
tive enhancement of the detail components [110]. The performance of unsharp
masking can be improved using adaptive filters [141], or non-linear filters [148].
Software packages such as Adobe® Photoshop® also provide sharpening
facilities. However, over-sharpening produces visible edge artifacts, known
as halos, and makes an image appear granular.
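The classic unsharp masking scheme described above, sharpened = image + amount × (image − low-pass), can be sketched in a few lines. This minimal version assumes a grayscale image in [0, 1] and uses a simple 3×3 box blur as the low-pass filter; practical implementations typically use a Gaussian blur and a threshold on the detail term.

```python
import numpy as np

def unsharp_mask(image, amount=1.0):
    """Sharpen an image by adding back the detail (high-pass) component.

    The detail is the difference between the image and a low-pass
    (3x3 box-blurred) version of it: the classic unsharp masking
    scheme sharpened = image + amount * (image - blurred).
    """
    padded = np.pad(image, 1, mode='edge')
    # 3x3 box blur as a simple low-pass filter
    blurred = sum(padded[i:i + image.shape[0], j:j + image.shape[1]]
                  for i in range(3) for j in range(3)) / 9.0
    detail = image - blurred
    return np.clip(image + amount * detail, 0.0, 1.0)

# A step edge becomes steeper (more pronounced) after sharpening
img = np.tile(np.array([0.2, 0.2, 0.2, 0.8, 0.8, 0.8]), (6, 1))
sharp = unsharp_mask(img, amount=1.5)
```

Note that the clipping step also hints at where halos come from: a large `amount` pushes pixel values on either side of an edge past the original dark and bright levels, creating the visible overshoot described above.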