[Figure 8.7 diagram: at each recursion level the signal is passed through the low-pass filter h(n) and the high-pass filter g(n); each filtered output has size = 0.5*size of its input, and the low-pass branch is decomposed again.]
Fig. 8.7.
Procedure of recursive wavelet decomposition.
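The recursive decomposition in Fig. 8.7 can be sketched as follows. This is a minimal illustration, not the chapter's implementation: the Haar filter pair is an assumed concrete choice for h(n) and g(n), since the text does not fix the filters.

```python
import numpy as np

def wavelet_decompose(signal, h, g, levels):
    """Recursively split `signal` into approximation (h branch) and detail
    (g branch) coefficients; each level halves the size (size = 0.5*size)."""
    details = []
    approx = np.asarray(signal, dtype=float)
    for _ in range(levels):
        # Convolve with the low-pass h(n) and high-pass g(n) filters,
        # then downsample by 2 (keep every other sample).
        low = np.convolve(approx, h, mode="same")[::2]
        high = np.convolve(approx, g, mode="same")[::2]
        details.append(high)
        approx = low          # recurse on the low-pass branch only
    return approx, details

# Haar filters (an assumed choice; the chapter does not specify h, g)
h = np.array([0.5, 0.5])      # low-pass / averaging
g = np.array([0.5, -0.5])     # high-pass / differencing
approx, details = wavelet_decompose(np.arange(16, dtype=float), h, g, 3)
# sizes shrink 16 -> 8 -> 4 -> 2, matching the halving in Fig. 8.7
```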
8.3 A Cascaded Architecture of a Neural Fuzzy
Network with Feature Mapping (CNFM)
After we obtain the features from the spatial, statistical, and spectral domains, we
shall proceed to train the CNFM. It is composed of two cascaded neural networks.
The former is the unsupervised Kohonen's SOM, and the latter is the supervised
neural fuzzy network (called SONFIN). This complementary architecture can
compensate for the problems of inaccuracy and long training time of unsupervised
and supervised neural networks, respectively. The most important contribution of this architecture is its improved input selection. In contrast to the conventional trial-and-error method, our system first reduces the input dimension with Kohonen's SOM, which transforms each group of features from a channel into 2D coordinates. SONFIN then compensates for the inaccuracy introduced by Kohonen's SOM. Hence, no matter how many features and channels we use, the proposed mechanism avoids both a large input space and long training time.
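The dimension-reduction effect of the cascade can be made concrete with a small count. The channel and feature numbers below are assumptions for illustration only, not values from the text:

```python
# With C channels and F features per channel, a supervised network fed the
# raw features would see C*F inputs. After the SOM stage, each channel's
# feature group collapses to one (row, col) coordinate pair, so SONFIN
# sees only 2*C inputs regardless of F.
def sonfin_input_dim(n_channels: int) -> int:
    return 2 * n_channels

raw_dim = 8 * 30                   # e.g. 8 channels, 30 features each
reduced_dim = sonfin_input_dim(8)  # 16 inputs, independent of F
```

Adding features to a channel therefore leaves the SONFIN input size unchanged; only adding a channel grows it, by two.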
8.3.1 Reduction of Input Dimension by Unsupervised Network
This subsection introduces Kohonen's self-organizing map (SOM), which maps high-dimensional inputs onto a 2D map and filters out some noisy information. We apply a conscience mechanism to Kohonen's SOM so that the input clusters are distributed evenly over the map. With these benefits, the system can easily adapt to changes (increases) in both features and channels. The most important merit is that we can avoid trial-and-error methods for removing redundancy, such as genetic algorithms, Karhunen-Loeve (KL) expansion, and the correlation measure (x_j^T y) / [(x_j^T x)(y_j^T y)].
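A minimal sketch of a 2D Kohonen SOM with a conscience term follows. The grid size, learning rate, bias factor, and frequency-update rate are all assumed values, and the conscience form is the common DeSieno-style bias rather than anything specified in the text:

```python
import numpy as np

def train_som_conscience(X, grid=(5, 5), epochs=20, lr=0.2, bias=0.1):
    """Train a 2D SOM whose winner selection is biased against units that
    win too often, so clusters spread evenly over the map (a sketch)."""
    rng = np.random.default_rng(0)
    n_units = grid[0] * grid[1]
    W = rng.normal(size=(n_units, X.shape[1]))   # codebook vectors
    win_freq = np.full(n_units, 1.0 / n_units)   # running win frequencies
    for _ in range(epochs):
        for x in X:
            d = np.sum((W - x) ** 2, axis=1)
            # conscience: handicap frequent winners, favor idle units
            biased = d - bias * (1.0 / n_units - win_freq)
            j = int(np.argmin(biased))
            win_freq += 0.01 * ((np.arange(n_units) == j) - win_freq)
            W[j] += lr * (x - W[j])              # move winner toward x
    return W

def map_to_2d(x, W, grid=(5, 5)):
    """Project a high-dimensional feature vector to its winning unit's
    2D grid coordinates (row, col) -- the reduced input fed to SONFIN."""
    j = int(np.argmin(np.sum((W - x) ** 2, axis=1)))
    return divmod(j, grid[1])
```

The `map_to_2d` step is what turns each channel's feature group into the 2D coordinates mentioned above.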