an image band has the highest correlation with the adjacent band. For a spectrally
ordered dataset, the conditional entropy (the measure of additional information)
monotonically increases with the spectral distance.
In this case, while evaluating a band I_r for its possible selection for fusion using the aforementioned scheme, the minimum value of the conditional entropy is found to be in the band from the subset I whose spectral distance is the least from the band I_r. If (p − 1) image bands have already been selected for fusion from the first (r − 1) input image bands (where p ≤ r), then the most recently selected band, i.e., the (p − 1)-th band, has the least spectral distance from I_r among the bands in I.
Thus, instead of evaluating the conditional information of band I_r against all (p − 1) bands of I as given in Eq. (4.2), we need to compare the value for the (p − 1)-th band only. When min_k H(I_r | I_k) = H(I_r | I_{p−1}), Eq. (4.2) reduces to the special case as proposed in [87].
p̂ = argmin_r { r : H(I_r | I_k) > Θ, k = p − 1 }.    (4.4)
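The selection rule above can be sketched as a single greedy pass over the spectrally ordered bands. The code below is a minimal illustration, not the authors' implementation: the function names, the 32-bin quantization, and the choice of seeding the subset with the first band are our own assumptions. The conditional entropy is estimated from a joint histogram via the identity H(I_r | I_k) = H(I_r, I_k) − H(I_k).

```python
import numpy as np

def entropy(counts):
    """Shannon entropy (bits) of a histogram of counts."""
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def conditional_entropy(band_r, band_k, bins=32):
    """Histogram estimate of H(I_r | I_k) = H(I_r, I_k) - H(I_k).

    The 32-bin quantization is an arbitrary illustrative choice."""
    joint, _, _ = np.histogram2d(band_r.ravel(), band_k.ravel(), bins=bins)
    return entropy(joint) - entropy(joint.sum(axis=0))

def select_bands(bands, theta):
    """Greedy special-case selection in the spirit of Eq. (4.4):
    a band joins the subset when its entropy conditioned on the
    most recently selected band exceeds the threshold theta."""
    selected = [0]                    # assumption: seed with the first band
    for r in range(1, len(bands)):
        if conditional_entropy(bands[r], bands[selected[-1]]) > theta:
            selected.append(r)
    return selected
```

Note that each candidate band triggers exactly one conditional-entropy evaluation, against `bands[selected[-1]]` only, which is the source of the computational savings discussed next.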
The calculation of the threshold Θ remains the same as given in Eq. (4.3).
The basic scheme of the entropy-based band selection process described in the previous subsection involves computation of the conditional entropy of every band given p selected bands, p = 1, 2, ..., K, where the number of selected bands p is monotonically increasing. On the other hand, in this special case of the band selection scheme, the conditional entropy of every band is calculated only once, i.e., with respect to the most recently selected band for fusion. Therefore, the band selection process turns out to be computationally very efficient, and is very suitable for fast visualization of hyperspectral data.
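To make the efficiency claim concrete, a back-of-the-envelope count of conditional-entropy evaluations can be written down (this is our own illustration, not a result from the text): the basic scheme compares each candidate band against every band selected so far, while the special case compares it against the most recent one only.

```python
# Hypothetical illustration: counting conditional-entropy evaluations
# for K input bands, given the indices of the bands that get selected.
def evaluations_basic(K, selected_indices):
    """Basic scheme: band r is compared against every band selected
    before it, so the cost grows with the subset size."""
    total = 0
    n_selected = 0
    sel = set(selected_indices)
    for r in range(K):
        total += n_selected          # one H(I_r | I_k) per selected band
        if r in sel:
            n_selected += 1
    return total

def evaluations_special(K):
    """Special case: each band is compared exactly once, against the
    most recently selected band."""
    return K - 1
```

For K = 200 bands with one band selected every 20, the basic scheme needs over a thousand evaluations while the special case needs 199, independent of how many bands are selected.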
4.3.1 Computational Savings
In the special case of spectrally ordered bands, a band is selected if the entropy of the band conditioned on the most recently selected band exceeds a threshold. The number of bands being selected and the corresponding average computational requirements depend on the nature of the function representing the conditional information H(I_r | I_k) of the image bands. When the band under evaluation is exactly the same as the chosen band, i.e., I_r = I_k, then the amount of additional information possessed by I_r is zero. On the contrary, when the bands I_r and I_k are totally uncorrelated, the conditional entropy of the band I_r equals its entropy, i.e., H(I_r | I_k) = H(I_r). We analyze the savings in computational requirements on the basis of appropriately modeling the conditional information.
Generally, the correlation between image bands decreases exponentially as the spectral distance between the corresponding bands increases; in that case, we may use the following theorem to compute the savings in the special case of the band selection scheme.
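One simple saturating model consistent with the two limiting cases above (zero at zero spectral distance, saturating to H(I_r) for uncorrelated bands) is H(I_r | I_k) ≈ H0·(1 − e^(−λd)) with d = |r − k|. This particular functional form, and the parameter names H0 and λ, are our own assumptions for illustration, not the book's theorem; under it, the number of selected bands for a given threshold follows directly.

```python
import math

# Hypothetical model (our assumption, not the book's theorem):
#   H(I_r | I_k) = H0 * (1 - exp(-lam * d)),  d = |r - k|,
# which is 0 at d = 0 and saturates to the full entropy H0.
def bands_selected(K, H0, lam, theta):
    """Smallest spacing d at which the modeled conditional entropy
    exceeds theta, and the resulting count of selected bands."""
    if theta >= H0:
        return 1                     # threshold never exceeded: seed band only
    d = max(1, math.ceil(-math.log(1.0 - theta / H0) / lam))
    return 1 + (K - 1) // d
```

As expected under this model, raising the threshold Θ widens the selection spacing and reduces the number of selected bands.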
 