Illustration 284: Backpropagation as the path of the steepest gradient: potential mountains of the error function F(x, y)
The starting point determines whether backpropagation reaches a local minimum or even the global minimum, i.e. the lowest point of the entire “landscape”. The Illustration demonstrates the procedure for the limiting case of only two variable weightings.
The vector u = (7; -5; 2.2; 0.5; 27) lies in a five-dimensional vector space, which can be imagined structurally rather than spatially. Thus, in the same Illustration, the frequencies of a periodic sawtooth form an infinite-dimensional (orthogonal) vector space, according to the mathematical structure of the so-called FOURIER series, which describes a periodic signal as the sum of sinusoidal waves.
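The orthogonality of these sinusoidal “basis vectors” can be checked numerically. The following Python sketch (not from the book; the period T and the helper names are illustrative) approximates the inner-product integral of two sine harmonics over one period: it is near zero for two different harmonics and near T/2 for a harmonic with itself.

```python
import numpy as np

T = 1.0                                   # assumed period of the sawtooth
t = np.linspace(0.0, T, 10_000, endpoint=False)

def harmonic(k):
    """k-th sine harmonic of the FOURIER series on [0, T)."""
    return np.sin(2 * np.pi * k * t / T)

def inner(f, g):
    """Discrete approximation of the inner-product integral over one period."""
    return np.sum(f * g) * (T / len(t))

print(inner(harmonic(1), harmonic(2)))    # ~0:   different harmonics are orthogonal
print(inner(harmonic(3), harmonic(3)))    # ~T/2: a harmonic is not orthogonal to itself
```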
This vector idea can also be applied to neural networks. If, for instance, 6 neurons form the input layer, the six input signals together span a six-dimensional vector space. Likewise, the signals leaving n output neurons span an n-dimensional space. If the six parameters are selected in such a way that the information of each is not contained in the other five, they are (in a linear sense) independent of each other or, beyond that, even orthogonal, i.e. standing perpendicularly on one another, as shown in Illustration 270 in Chapter 13.
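As a hypothetical numerical sketch (the example vectors are invented), linear independence of six such input vectors can be tested via the matrix rank, and orthogonality via the pairwise dot products; random vectors are typically independent but not orthogonal, which illustrates the distinction made above.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((6, 6))           # six example input vectors (rows)

# Linearly independent <=> the 6x6 matrix has full rank 6.
print("rank:", np.linalg.matrix_rank(X))

# Orthogonal <=> every dot product between different rows is zero.
G = X @ X.T                               # Gram matrix of all pairwise dot products
off_diagonal = G - np.diag(np.diag(G))
print("orthogonal:", np.allclose(off_diagonal, 0.0, atol=1e-9))
```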
Backpropagation as an error-correcting method, in which the weightings are modified, can be interpreted geometrically as a calculation in a multidimensional space. Even in the two-dimensional case, i.e. with only two variable weightings (!), the result is often a multiply vaulted surface with several mountains and valleys (see Illustration 283).
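A minimal sketch of this idea, assuming an invented error surface F(x, y) with several valleys (not the surface of the Illustration): gradient descent follows the path of steepest descent, and which minimum it ends in depends only on the starting point.

```python
import numpy as np

def F(x, y):
    """Illustrative error surface with several mountains and valleys."""
    return np.sin(3 * x) * np.cos(3 * y) + 0.1 * (x**2 + y**2)

def grad_F(x, y):
    """Analytic gradient of F."""
    dFdx = 3 * np.cos(3 * x) * np.cos(3 * y) + 0.2 * x
    dFdy = -3 * np.sin(3 * x) * np.sin(3 * y) + 0.2 * y
    return np.array([dFdx, dFdy])

def descend(start, lr=0.01, steps=2000):
    w = np.array(start, dtype=float)      # the two variable weightings
    for _ in range(steps):
        w -= lr * grad_F(*w)              # step against the gradient
    return w

# Different starting points typically end in different valleys of the landscape.
for start in [(0.5, 0.5), (-1.5, 1.0), (2.0, -2.0)]:
    w = descend(start)
    print(start, "->", np.round(w, 3), "F =", round(F(*w), 3))
```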