FIGURE 4.2: HGN composition for two- and three-dimensional patterns of size 49 and 147, respectively. Higher-dimensional structures have a significantly smaller network size.
As discussed in the previous chapter, pattern representation in a GN network applies the graph-based (value, position) structure. The HGN implementation follows a similar approach, but in addition to (value, position) pairs, the HGN requires the size of the patterns. Patterns used in the HGN recognition scheme must be of odd length. This requirement caters to the hierarchical structure of the HGN network and yields a single top neuron that sees the overall pattern structure. Patterns of even length must be padded with a "dummy" value at the end.
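As a rough sketch of this convention (the helper name, the 1-based positions, and the `#` dummy symbol are illustrative assumptions, not fixed by the text), an even-length pattern can be padded and mapped to (value, position) pairs like so:

```python
def prepare_pattern(pattern, dummy="#"):
    """Pad an even-length pattern to odd length with a trailing dummy
    element, then map it to the (value, position) pairs used by GN/HGN
    nodes. The '#' dummy symbol is an illustrative choice."""
    if len(pattern) % 2 == 0:
        pattern = pattern + dummy  # enforce odd length for the HGN hierarchy
    # positions are 1-based, matching the GN naming (e.g., U1, V2, ...)
    return [(value, position) for position, value in enumerate(pattern, start=1)]
```

For example, `prepare_pattern("uvwx")` pads the four-element pattern to `[('u', 1), ('v', 2), ('w', 3), ('x', 4), ('#', 5)]`, while an odd-length pattern such as `"uvwxz"` passes through unpadded.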
4.1.1 Solution to Crosstalk Problem
The main limitation of the Graph Neuron implementation, the intersection or crosstalk issue, is attributed to its inability to see the entire pattern structure. This limitation has been overcome by the hierarchical GN network layout of the HGN. In this subsection, further analysis of this solution will be presented.
A one-dimensional HGN network for patterns of size 5 bits, shown in Figure 4.3, will be considered. For six different pattern elements, the number of GNs required for this composition is 6 × ((5 + 1) ÷ 2)² = 54.
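This count can be verified by summing the layers of the hierarchy: for an odd pattern size n, the layers hold n, n − 2, …, 1 GNs per element value, which totals ((n + 1) ÷ 2)². A minimal sketch (the function name is an illustrative assumption):

```python
def hgn_size(num_elements, pattern_size):
    """Number of GNs in a one-dimensional HGN composition.

    The hierarchy's layers shrink by two GNs per level (n, n-2, ..., 1),
    so each element value contributes ((n + 1) / 2)^2 GNs in total.
    """
    assert pattern_size % 2 == 1, "HGN patterns must have odd length"
    per_value = sum(range(pattern_size, 0, -2))  # n + (n-2) + ... + 1
    return num_elements * per_value

# six possible elements, pattern length 5: 6 * (5 + 3 + 1) = 54 GNs
```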
When the pattern "uvwxz" is introduced into the HGN network, each GN whose (value, position) matches an element in the pattern will be activated. Therefore, GNs U1, V2, W3, X4, and Z5 will be activated. Once
activated, each base layer GN executes a recognition process by exchanging