Fig. 4.7. Other transfer functions: (a) square and square root; (b) logarithm and exponential.
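For concreteness, the following is a minimal sketch of the four plotted transfer functions, assuming NumPy; the function names and the clipping of invalid inputs are my additions, not taken from the text.

```python
import numpy as np

# Fig. 4.7 transfer functions; names and input clipping are my additions.
def square(x):
    return x ** 2                        # (a) expands large activities

def square_root(x):
    return np.sqrt(np.maximum(x, 0.0))   # (a) compresses; clipped at 0

def logarithm(x):
    return np.log(np.maximum(x, 1e-12))  # (b) maps products to sums; clipped

def exponential(x):
    return np.exp(x)                     # (b) maps sums back to products
```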
A ΣΠ-unit computes its activity as a product of weighted sums, one sum per projection:

\[
a^{t}_{ijkl} \;=\; \prod_{p=1}^{P_{kl}} \Bigl( \sum_{q=1}^{Q_{kl}} w^{kl}_{pq}\, a^{t}_{i'j'k'l'} \;+\; w^{kl}_{p0} \Bigr)^{v_{kl}}
\tag{4.9}
\]

Here, a^{t}_{i'j'k'l'} denotes the activity of the source unit addressed by weight w^{kl}_{pq}, and v_{kl} is an exponent applied to each projection. Such ΣΠ-units resemble the alternating operations of the sum-product algorithm
that implements Kalman filters, hidden Markov models and fast Fourier analysis in
factor graphs [129]. For that reason, it is important that the basic processing element
can implement products of sums.
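As a concrete illustration, here is a minimal NumPy sketch of such a product-of-sums element, assuming dense (P, Q) weight and activity arrays; the function names and shapes are mine, not the book's. The second variant computes the same value through the logarithm and exponential transfer functions of Figure 4.7, which is one way units with log and exp transfer functions can realize products of sums.

```python
import numpy as np

def sigma_pi_activity(a, W, w0, v):
    """Product-of-sums unit in the spirit of Eq. (4.9) -- a sketch.

    a  : (P, Q) array, source activities gathered for each projection p
    W  : (P, Q) array, weights w_pq
    w0 : (P,)   array, bias weights w_p0
    v  : scalar, the exponent v_kl
    """
    sums = (W * a).sum(axis=1) + w0   # one weighted sum per projection
    return np.prod(sums ** v)         # product over the P projections

def sigma_pi_activity_logexp(a, W, w0, v):
    # Same value via the log/exp transfer functions of Fig. 4.7,
    # valid when every inner sum is positive.
    sums = (W * a).sum(axis=1) + w0
    return np.exp(v * np.log(sums).sum())
```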
The transfer functions discussed above are not the only possible choices. Nevertheless, they illustrate how, within the Neural Abstraction Pyramid framework, representations with different properties and network dynamics with different behaviors can be created.
4.3 Example Networks
To illustrate possible uses of the Neural Abstraction Pyramid architecture, the following section presents some small example networks that were designed manually.
4.3.1 Local Contrast Normalization
The first example focuses on horizontal and vertical interaction in a hierarchy, but
does not yet increase the number of features when decreasing the resolution. It im-
plements local contrast normalization in the Neural Abstraction Pyramid.
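Before turning to the network itself, the following is a generic sketch of the operation: subtract a local mean and divide by a local contrast estimate. The box-filter window size and the stabilizing constant eps are assumptions, and this is not the specific pyramid network described in this section.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_contrast_normalize(img, size=15, eps=1e-3):
    """Subtract the local mean, divide by the local standard deviation.

    A generic box-filter sketch; window size and eps are assumptions,
    not the pyramid network described in this section.
    """
    img = np.asarray(img, dtype=np.float64)
    mean = uniform_filter(img, size)           # local mean
    sq_mean = uniform_filter(img ** 2, size)   # local mean of squares
    std = np.sqrt(np.maximum(sq_mean - mean ** 2, 0.0))
    return (img - mean) / (std + eps)          # eps avoids division by zero
```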
Contrast normalization helps to overcome the limited dynamic range of linear
image sensors. Typical sensors measure the local intensity with an accuracy of 8 bits
per pixel. This leads to problems if very high and very low intensities are present
simultaneously in an image. Figure 4.8(a) shows such a problematic image, taken
with an entry-level digital still camera. The foreground is very dark, while the back-
ground, visible through the window, is very bright. The limited dynamic range of
the camera impairs the visibility of details. Global normalization of brightness does not solve this problem, since it can improve the visibility of details either in the dark foreground or in the bright background, but not in both at the same time.