cube, corresponding to the wavelengths from 1693.765 to 2582.273 nm. Fig. 3.4a-c
show these fused images of the Palo Alto region collected by the Hyperion imaging
sensor after the second stage. We obtain the final grayscale fused image of this
urban dataset by fusing these three pre-final-stage images; the result spans a
bandwidth of 2231.768 nm and is shown in Fig. 3.4d.
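For concreteness, the sketch below illustrates one way such a final fusion stage could be written. A simple per-pixel weighted average stands in for the actual fusion rule of the text (Ref: [88]), and the function and variable names here are hypothetical.

```python
import numpy as np

def fuse_images(images, weights=None):
    """Fuse a list of grayscale images into one by per-pixel averaging.

    A placeholder fusion rule: any rule that maps several images of the
    same scene to a single image fits this interface.
    """
    stack = np.stack(images, axis=0).astype(np.float64)  # (N, H, W)
    if weights is None:
        weights = np.ones(len(images)) / len(images)
    fused = np.tensordot(weights, stack, axes=1)         # (H, W)
    # Rescale to the full 8-bit grayscale range for display.
    fused -= fused.min()
    fused *= 255.0 / max(fused.max(), 1e-12)
    return fused.astype(np.uint8)

# Final stage: combine the three second-stage images into one result.
# final = fuse_images([stage2_a, stage2_b, stage2_c])
```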
Let us consider another hyperspectral dataset, provided by the AVIRIS imaging
sensor developed by the Jet Propulsion Laboratory (JPL/NASA). This is an airborne
sensor; the data are collected as the aircraft flies over the corresponding region
of the Earth. This dataset, consisting of 224 bands, depicts the region of Moffett
Field in California. Let us refer to it as the moffett 2 dataset.
This dataset has the dimension of (614 × 512 × 224), where each pixel is represented by 16 bits.
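Purely as an illustration of these dimensions, a cube of this size could be read as follows. This is a hypothetical loader: it assumes the samples are raw unsigned 16-bit integers stored band-by-band (BSQ), which need not match the actual AVIRIS distribution format.

```python
import numpy as np

ROWS, COLS, BANDS = 614, 512, 224  # moffett 2 dimensions from the text

def load_cube(path):
    """Read a raw 16-bit hyperspectral cube stored band-sequentially."""
    data = np.fromfile(path, dtype=np.uint16, count=ROWS * COLS * BANDS)
    return data.reshape(BANDS, ROWS, COLS)  # one 614 x 512 image per band

# bands = list(load_cube("moffett2.raw"))  # hypothetical file name
```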
Similar to the previous example, we have employed a three-stage hierarchical fusion
scheme for the fusion of the moffett 2 dataset. Each of the resultant images of the
first stage thus represents a combined response of the scene over a bandwidth of
nearly 120 nm, and each of these images is contiguous over the wavelength spectrum.
These 18 first-stage fused images are evenly grouped and fused using the same
fusion methodology to form three second-stage fused images. The first of these
images, which is actually the result of fusing the first 78 bands of the AVIRIS
data, is shown in Fig. 3.5a.
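The grouping logic of this three-stage hierarchy can be sketched as follows, again with a simple per-pixel mean standing in for the actual fusion rule and with hypothetical function names. The schedule (18, 3, 1) mirrors the text: 224 bands are partitioned into 18 contiguous groups, the 18 first-stage images into 3 groups, and the 3 second-stage images into the single final image.

```python
import numpy as np

def fuse_group(images):
    """Placeholder fusion rule: per-pixel mean over a group of images."""
    stack = np.stack([im.astype(np.float64) for im in images], axis=0)
    return stack.mean(axis=0)

def partition(items, n_groups):
    """Split a list into n_groups contiguous, nearly equal subsets."""
    k, r = divmod(len(items), n_groups)
    groups, start = [], 0
    for i in range(n_groups):
        size = k + (1 if i < r else 0)
        groups.append(items[start:start + size])
        start += size
    return groups

def hierarchical_fuse(bands, schedule=(18, 3, 1)):
    """Three-stage scheme: 224 bands -> 18 images -> 3 images -> 1 image."""
    images = list(bands)
    for n_groups in schedule:
        images = [fuse_group(g) for g in partition(images, n_groups)]
    return images[0]
```

With this nearly equal partition, the first 8 of the 18 groups hold 13 bands and the rest hold 12, so the three second-stage images cover bands 1-78, 79-152, and 153-224, matching the panels of Fig. 3.5a-c.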
Fig. 3.5 Results of the second-stage fusion of the moffett 2 image cube from the AVIRIS, and the
final fused image. a Fusion over bands 1-78. b Fusion over bands 79-152. c Fusion over bands
153-224. d Final fused image. (©2010 IEEE, Ref: [88])
 