n_{sn} = \frac{m}{s_{sp}}, \quad s_{sp} \le m    (5.2)
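For example, a pattern of size m = 35 decomposed into subpatterns of size s_{sp} = 7 would be distributed over n_{sn} = 35/7 = 5 DHGN subnets.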
The neurons within a DHGN subnet are structured in a hierarchical manner, similar to the HGN. Each layer within the DHGN subnet is populated with neurons. The number of layers, l_{gn}, required within a DHGN subnet is given by the following equation:
l_{gn} = \frac{s_{sp} + 1}{2}    (5.3)
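Continuing the same example, a subpattern of size s_{sp} = 7 gives l_{gn} = (7 + 1)/2 = 4 layers within each subnet.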
Note that the number of neuron layers can be determined directly from the calculation of the network size shown in Equation 5.1. The conditions for GN node generation within a particular layer are as follows (a short computational sketch follows the list):
1. At the base layer l_0, the number of neurons generated, n_{l_{gn}}, is equivalent to the size of the subpattern multiplied by the number of different elements v, i.e., n_{l_{gn}} = s_{sp} \times v.
2. At a middle layer l_i, the number of neurons n_{l_{gn}} varies according to the level i of the layer in the hierarchy, except for the top layer. Therefore, n_{l_{gn}} = v(s_{sp} - 2i).
3. At the top layer l_t, the number of processing neurons required is equivalent to the number of different elements v. Hence, n_{l_{gn}} = v.
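To make these counts concrete, the following is a minimal Python sketch; the function name is ours, and s_{sp} is assumed to be odd so that Equation 5.3 yields an integer.

def dhgn_layer_sizes(s_sp: int, v: int) -> list:
    """Neuron counts per layer for one DHGN subnet.

    s_sp : subpattern size (assumed odd, so Equation 5.3 gives an integer)
    v    : number of different pattern elements
    """
    l_gn = (s_sp + 1) // 2                    # Equation 5.3
    sizes = []
    for i in range(l_gn):
        if i == l_gn - 1:
            sizes.append(v)                   # top layer l_t: v neurons
        else:
            sizes.append(v * (s_sp - 2 * i))  # base (i = 0) and middle layers
    return sizes

# Example: s_sp = 7 with binary elements (v = 2) gives [14, 10, 6, 2].
print(dhgn_layer_sizes(7, 2))

At i = 0 the middle layer formula v(s_{sp} - 2i) reduces to s_{sp} \times v, matching the base layer case.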
In the network generation stage, the SI module is also responsible for initializing the DHGN subnets. The initialization involves communicating the possible input values to the base layer neurons before the actual store/recall operations can start. The message communication between the SI module and the base layer neurons (within each DHGN subnet) is conducted using a specific message communication protocol that has been developed for bitmap patterns. The SI module sends the possible input values to each DHGN subnet using an (instruction, message) format. For example, if binary values are to be communicated, then the message would be (initialize, (0, 1)).
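A rough sketch of this exchange is given below, assuming a hypothetical SIMessage container and a deliver_to_base_layer method on each subnet; neither name comes from the text, which only specifies the (instruction, message) pairing.

from dataclasses import dataclass
from typing import Sequence, Tuple

@dataclass(frozen=True)
class SIMessage:
    """Illustrative (instruction, message) pair sent by the SI module."""
    instruction: str
    payload: Tuple[int, ...]

def broadcast_initialization(subnets: Sequence,
                             values: Tuple[int, ...] = (0, 1)) -> None:
    """Send the possible input values to the base layer of every DHGN subnet."""
    msg = SIMessage("initialize", values)
    for subnet in subnets:
        # deliver_to_base_layer is an assumed subnet interface; the text only
        # states that the SI module communicates with the base layer neurons.
        subnet.deliver_to_base_layer(msg)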
Each initialization message received by the base layer neurons is used for coordination within the base layer; each neuron represents a specific position. The formation of the base layer neurons for binary pattern recognition can be sketched as follows.
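The Python sketch below is illustrative rather than the original listing; the GNNeuron class and form_base_layer function are assumed names, but they follow the distinct (value, position) pairing noted below.

from typing import List, Tuple

class GNNeuron:
    """Illustrative base layer Graph Neuron holding one (value, position) pair."""
    def __init__(self, value: int, position: int):
        self.value = value          # input value this neuron responds to (0 or 1)
        self.position = position    # position within the subpattern
        self.bias_entries: List[Tuple[int, ...]] = []  # filled during store/recall

def form_base_layer(s_sp: int, values: Tuple[int, ...] = (0, 1)) -> List[GNNeuron]:
    """Create one neuron per (value, position) pair, i.e. s_sp x v neurons."""
    return [GNNeuron(v, p) for p in range(s_sp) for v in values]

# Example: a subpattern of size 5 over binary values yields 10 base layer neurons.
base_layer = form_base_layer(5)
assert len(base_layer) == 5 * 2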
Note that the initialization process involves uploading distinct (value, position) pairs into the respective neurons for later use in the store/recall operations.