move away from localized measurements of a signal towards estimates of the pres-
ence of complex features. One can compare this to the fast Fourier transform
that modifies a spatially localized representation step-by-step into a representation
that is localized in frequency.
If the Neural Abstraction Pyramid is used for classification, an output layer may
be added to the network that contains a single cell for each class to represent the
classification result in localized code. This final step of the transformation makes the
class-information explicit. The localized code facilitates access to the classification
result because it is easier to interpret than a distributed code, but it is not biologically
plausible. Note that such an output layer would not be necessary if the pyramidal
perception network were followed by an action network with the shape of an
inverted pyramid that expands the abstract representations step-by-step into concrete
motor commands.
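The localized readout described above can be sketched as follows. The feature count, class count, and weight matrix are illustrative assumptions, not part of the original network; only the idea of one output cell per class is taken from the text.

```python
import numpy as np

# Hypothetical readout layer: one cell per class (localized code) on top
# of the topmost distributed representation of the pyramid.
rng = np.random.default_rng(0)
num_features, num_classes = 16, 4           # assumed sizes
abstract_features = rng.random(num_features)  # stand-in for top-layer activities
W = rng.random((num_classes, num_features))   # assumed learned weights

# Each output cell accumulates evidence for its class; the most active
# cell makes the classification result explicit and easy to read off.
class_activities = W @ abstract_features
predicted_class = int(np.argmax(class_activities))
```

The argmax step is what makes the localized code easier to interpret than the distributed code feeding it: the result is a single index rather than a pattern of activities.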
4.1.3 Local Recurrent Connectivity
Retinotopic projections mediate between the layers of the Neural Abstraction Pyra-
mid. Three types of projections are used in the network to compute a cell at position
(i, j) in layer l:
- Forward projections originate in the adjacent lower layer (l − 1). These pro-
jections have access to all features of the hyper-neighborhood centered at the
corresponding position (2i, 2j) and are used for feature extraction.
- Lateral projections stay within a layer. They access all features at positions close
to (i, j) and make feature cell activities within a hyper-neighborhood consistent
with each other.
- Backward projections come from the hyper-neighborhood centered at position
(i/2, j/2) of the next higher layer (l + 1). They expand abstract features to less
abstract ones.
This local recurrent connection structure resembles the horizontal and vertical
feedback loops found in the cortex. The restriction to a local connectivity is neces-
sary to keep computational costs down [140]. Compared to a quadratic number of
possible connections, a local connection structure is much less expensive since it
is linear in the number of cells. In the hierarchical network this advantage is most
obvious in the lower layers, where the hyper-neighborhood of a cell contains only a
small fraction of all cells of a layer. Towards the top of the pyramid this advantage
is less striking, since the ratio between the number of cells in a hyper-neighborhood
and the total number of cells in a layer approaches one.
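The cost argument above is simple arithmetic; the layer and neighborhood sizes below are assumed example values. With n cells in a layer, full connectivity needs on the order of n² links, while a fixed-size neighborhood of k cells needs only k·n, i.e. cost linear in the number of cells.

```python
# Assumed example sizes: a 64x64 lower layer and a 3x3 hyper-neighborhood.
n = 64 * 64          # cells in the layer
k = 3 * 3            # cells accessed per projection (local neighborhood)

full_links = n * n   # quadratic: every cell connected to every cell
local_links = k * n  # linear: every cell connected to a fixed-size window

# Locality saves a factor of roughly n/k, largest in the big lower layers.
print(full_links // local_links)
```

Toward the apex of the pyramid n shrinks until the k-cell neighborhood covers most of the layer, which is exactly why the saving is "less striking" there.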
There are more advantages of a local connection structure than the low number
of connections alone. Local connections require only short wires when implemented
in hardware. They also facilitate the distribution of labor between parallel machines
with distributed memory. Even when simulating such networks on serial machines,
locality of memory access patterns is an advantage, since it increases the probability
of cache hits.