The total number of neurons, n(x, y), required for a two-dimensional HGN composition over an x × y pattern (x and y odd) is obtained by summing the neurons across all layers:

\[
n(x, y) = xy + (x-2)y + (x-4)y + \cdots + y + (y-2) + (y-4) + \cdots + 3 + 1
\]

Writing the two parts of the series as summations (the single subtracted y removes the y-neuron layer that would otherwise be counted by both sums) gives

\[
n(x, y) = \left( \sum_{i=0}^{(x-1)/2} (x-2i) \right) y \; - \; y \; + \; \sum_{i=0}^{(y-1)/2} (y-2i)
\]

which reduces to the closed form

\[
n(x, y) = \left( \left( \frac{x+1}{2} \right)^{2} - 1 \right) y + \left( \frac{y+1}{2} \right)^{2} \tag{4.2}
\]
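For example, for a 5 × 5 pattern the layer series gives n(5, 5) = 25 + 15 + 5 + 3 + 1 = 49, and Equation (4.2) agrees: ((6/2)² − 1) · 5 + (6/2)² = 8 · 5 + 9 = 49.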
However, Equation (4.2) does not take into account the number of different pattern elements, v. The effect of v on n(x, y) is introduced in the following equation:
\[
n(x, y) = v \left( \left( \left( \frac{x+1}{2} \right)^{2} - 1 \right) y + \left( \frac{y+1}{2} \right)^{2} \right) \tag{4.3}
\]
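To make the counts concrete, the following Python snippet (a sketch of ours, not code from the text; the function names are hypothetical) counts the neurons of a two-dimensional HGN both layer by layer and via the closed form of Equation (4.3):

```python
def hgn_neurons_series(x, y, v=1):
    """Count 2-D HGN neurons layer by layer for an x-by-y pattern (x, y odd):
    layers of x*y, (x-2)*y, ..., y neurons, then y-2, y-4, ..., 1."""
    total = sum((x - 2 * i) * y for i in range((x - 1) // 2 + 1))
    total += sum(y - 2 * i for i in range(1, (y - 1) // 2 + 1))
    return v * total


def hgn_neurons_closed(x, y, v=1):
    """Closed form of Equation (4.3)."""
    return v * ((((x + 1) // 2) ** 2 - 1) * y + ((y + 1) // 2) ** 2)


# Both counts agree, e.g. for a 7-by-5 binary pattern (v = 2):
assert hgn_neurons_series(7, 5, v=2) == hgn_neurons_closed(7, 5, v=2) == 168
```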
To illustrate the effect that higher-dimensional patterns have on the number of neurons required in an HGN implementation, the neuron counts for one- and two-dimensional HGN compositions are plotted as a function of the binary pattern size in Figure 4.4. The two-dimensional composition is restricted to square patterns, i.e., x = y.
The graph shows that a two-dimensional composition requires significantly fewer neurons than a one-dimensional structure. However, the complexity of the HGN algorithm for higher-dimensional structures is not guaranteed to match that of the one-dimensional composition. Furthermore, for large-scale patterns the network itself may grow very large, and in a high-dimensional representation the collective size of the bias array can be significant. These aspects are discussed further in later sections.
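The gap shown in Figure 4.4 can be reproduced numerically. The sketch below is our own illustration: it assumes the one-dimensional count n(s) = v((s+1)/2)² (which also follows from Equation (4.2) with y = 1) and evaluates both compositions for square binary patterns with v = 2:

```python
def neurons_1d(s, v=2):
    """1-D HGN over an s-bit pattern (s odd): layers of s, s-2, ..., 1 neurons."""
    return v * ((s + 1) // 2) ** 2


def neurons_2d(x, y, v=2):
    """2-D HGN, closed form of Equation (4.3)."""
    return v * ((((x + 1) // 2) ** 2 - 1) * y + ((y + 1) // 2) ** 2)


for x in (5, 7, 9, 11):
    s = x * x  # total binary pattern size
    print(f"{s:>4} bits: 1-D {neurons_1d(s):>6}, 2-D {neurons_2d(x, x):>6}")
# A 121-bit pattern, for instance, needs 7442 neurons in 1-D but only 842 in 2-D.
```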
4.1.3 HGN Recognition Procedure
The HGN pattern recognition procedure involves a number of stages, including recognition at every layer within the hierarchical structure. The communication paths within the HGN layers are similar to those of the simple GN implementation: HGN communications propagate from the base-layer neurons to the top neuron, and subsequently from the top neuron back to the base-layer neurons.
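Purely as a structural sketch of this two-pass flow (the class and method names below are hypothetical, and the per-layer recall logic is deliberately elided), the traversal can be pictured as follows:

```python
class HGNFlowSketch:
    """Illustrative skeleton of HGN communication: an upward pass from the
    base layer to the top neuron, then a downward pass back to the base."""

    def __init__(self, layers):
        self.layers = layers  # layers[0] is the base layer, layers[-1] the top neuron

    def recognise(self, pattern):
        msg = pattern                        # input arrives at the base layer
        for layer in self.layers:            # upward pass toward the top neuron
            msg = layer.up(msg)
        for layer in reversed(self.layers):  # downward pass back to the base layer
            msg = layer.down(msg)
        return msg                           # result read off the base layer
```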
The HGN communication procedure is as follows. Each neuron in the base layer receives an input pattern from an external entity, which we refer to as the Stimulator and Interpreter (SI) module after Nasution and Khan [3]. Each neuron that receives an input is called an active neuron. Each active neuron in the base layer acknowledges that it is active by sending its p(column, row)