Ideally, the values of the sensor selectivity factor should be continuous within the range [0, 1]. Every pixel corresponds to a different spatial location in the scene with a similar but different material composition. Therefore, every pixel should have a different value of the sensor selectivity factor. We can, however, expect these values to exhibit some degree of spatial correlation within a given band. In the next section we explain a solution to compute the sensor selectivity factor that conforms to the aforementioned constraints.
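To make these two constraints concrete, here is a minimal sketch (a hypothetical helper, not the procedure developed below) that clips a raw per-pixel map to [0, 1] and averages it over a small neighborhood, so that neighboring pixels receive similar, but not identical, values:

```python
import numpy as np

def smooth_selectivity(raw, radius=2):
    """Illustrative: enforce the [0, 1] range and spatial smoothness
    on a raw per-pixel selectivity map via a box-filter average.
    (Hypothetical helper, not the authors' exact procedure.)"""
    raw = np.clip(raw, 0.0, 1.0)            # keep values in [0, 1]
    padded = np.pad(raw, radius, mode='edge')
    k = 2 * radius + 1
    out = np.empty_like(raw)
    for i in range(raw.shape[0]):
        for j in range(raw.shape[1]):
            # local mean imposes spatial correlation within the band
            out[i, j] = padded[i:i + k, j:j + k].mean()
    return out
```

Averaging clipped values cannot leave [0, 1], so both constraints hold by construction.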
5.4 Computation of Model Parameters
Through the sensor selectivity factor, one is primarily interested in mapping the quality of the corresponding pixel in the input hyperspectral data. Yang et al. have associated the sensor selectivity factor with the ability of a given sensor to see the objects [196], while Kumar et al. consider it to be the gain of the sensor [95]. With this analogy, the factor can assume any value in the range [0, 1]. Their computation is based on the eigenvectors of small image blocks, which makes the surface of the selectivity factor constant over these image blocks. However, this surface can be totally discontinuous across adjacent blocks in the given band. Also, if this technique were to be implemented over hyperspectral data, it would prove computationally very demanding due to the huge volume of the data.
We associate the value of the sensor selectivity factor with the perceived quality of the pixel in a particular band. As the primary goal of fusion is visualization, we want pixels with higher visual quality to contribute more towards the final fused image. The conventional fusion weight defines the contribution of a pixel relative to the rest of the observations from the entire set of input bands. In the present case, we are dealing with a single band and the fused image through the image formation model. Although the visual or perceived quality is best judged by a human observer, employing subjective measures is not a viable and practical solution. One can, however, employ various objective quality measures that are closely related to the visual quality of the pixel. Objective measures offer a uniform and repeatable assessment of pixel quality. Several no-reference quality measures have been discussed in the literature. These measures can be easily calculated from the image itself, without any reference or ground truth. Their use eliminates the dependency on any other standard or reference data, and hence makes the quality assessment a stand-alone process. Also, in most cases, quality measures such as contrast or sharpness are easy to calculate and computationally efficient. In this section we explain the use of no-reference quality measures, computed locally at every point, to compute the sensor selectivity factor. While a single quality measure of the image can certainly be used for this purpose, one measure alone may not be able to quantify the pixel quality, and thereby the sensor selectivity factor, efficiently. Different quality measures capture different aspects of the visual quality of the image, and thus we need a combination of measures that capture complementary aspects of the pixel quality. We develop a simple, yet efficient way to compute the sensor selectivity factor.
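As a concrete sketch of this idea (the specific measures and the combination rule below are illustrative choices, not the authors' exact scheme), the following computes two complementary no-reference measures per pixel, a local-contrast map and a gradient-based sharpness map, normalizes each to [0, 1], and multiplies them into a single selectivity map:

```python
import numpy as np

def local_contrast(band, radius=1):
    # No-reference contrast measure: standard deviation of the
    # intensities in a small window around each pixel.
    b = band.astype(float)
    padded = np.pad(b, radius, mode='edge')
    k = 2 * radius + 1
    out = np.empty_like(b)
    for i in range(b.shape[0]):
        for j in range(b.shape[1]):
            out[i, j] = padded[i:i + k, j:j + k].std()
    return out

def sharpness(band):
    # No-reference sharpness proxy: per-pixel gradient magnitude.
    gy, gx = np.gradient(band.astype(float))
    return np.hypot(gx, gy)

def selectivity_map(band, measures=(local_contrast, sharpness), eps=1e-8):
    # Min-max normalize each measure to [0, 1] and multiply them, so a
    # pixel must score well on every measure to receive a high value.
    # (Illustrative combination rule.)
    combined = np.ones(band.shape, dtype=float)
    for m in measures:
        q = m(band).astype(float)
        q = (q - q.min()) / (q.max() - q.min() + eps)
        combined *= q
    return combined
```

By construction the resulting map stays in [0, 1]; smoothing it spatially, as discussed earlier in the chapter, would additionally enforce the correlation constraint within the band.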