Y_8h = 0.86 − 0.18*(−X1 + X3*(0.52 + X3 + X3*(0.79 − X2 + X3 − 0.48*(−X1 + 2*X3^8))))
[5.29]
t_90% = X3 − 0.158*X1*(−0.09 + (X1/X3) − X3 + (X3*X4^5/X1) + (X2/(X1 + X2 − (0.83*X1/X3) + X3 + X4)))
[5.30]
It was found that the predictive ability of GP on an external
validation set was higher than that of the ANNs.
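As a sketch, the two GP-derived models in equations [5.29] and [5.30] can be coded directly as functions. The exponents (X3^8, X4^5) are reconstructed from the garbled source text and should be checked against the original publication; the input values used below are arbitrary illustrations, not data from the study:

```python
def y_8h(x1, x2, x3):
    # Eq. [5.29]: GP model for Y_8h (exponent on x3 assumed to be 8)
    return 0.86 - 0.18 * (-x1 + x3 * (0.52 + x3 + x3 * (
        0.79 - x2 + x3 - 0.48 * (-x1 + 2 * x3**8))))

def t_90(x1, x2, x3, x4):
    # Eq. [5.30]: GP model for t_90% (exponent on x4 assumed to be 5)
    return x3 - 0.158 * x1 * (-0.09 + (x1 / x3) - x3
        + (x3 * x4**5 / x1)
        + (x2 / (x1 + x2 - (0.83 * x1 / x3) + x3 + x4)))

# Illustrative evaluation with arbitrary inputs:
print(y_8h(1.0, 1.0, 1.0), t_90(1.0, 1.0, 1.0, 1.0))
```

Unlike an ANN, such a GP model is an explicit closed-form expression, which is what makes its external-validation behavior easy to inspect term by term.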
5.5 Self-organizing maps
5.5.1 Introduction
Development of a self-organizing map (SOM) is an example of an
unsupervised learning process. Competitive, unsupervised self-organizing
learning is characterized by competition among neighboring cells of a
neural network in their activities: through mutual lateral interactions,
the cells adaptively develop into specific detectors of different signal
patterns (Kohonen, 1990).
Competitive learning is sometimes referred to as the 'winner-takes-all'
strategy, since each neuron competes with the others during network
training to best represent the data set (Dow et al., 2004).
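The winner-takes-all step described above can be sketched in a few lines. This is a minimal illustration, not the implementation used in the cited work; the weight vectors and input below are hypothetical, and distance is taken as Euclidean:

```python
import math

def winner(weights, x):
    # Each neuron competes: the one whose weight vector lies closest
    # to the input x (smallest Euclidean distance) wins.
    def dist(w):
        return math.sqrt(sum((wi - xi) ** 2 for wi, xi in zip(w, x)))
    return min(range(len(weights)), key=lambda i: dist(weights[i]))

weights = [[0.1, 0.9], [0.8, 0.2], [0.5, 0.5]]  # three neurons (hypothetical)
x = [0.75, 0.25]
print(winner(weights, x))  # prints 1: neuron 1 is closest to x
```

During training, only the winner (and, in a SOM, its map neighbors) is updated toward the input, which is what drives the competing neurons to specialize.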
5.5.2 Theory
SOMs were first introduced by the Finnish professor Teuvo Kohonen in the
1980s. Their basic function is to reduce the dimensionality of a complex
data set and present it visually in a simplified manner, while preserving
its topological properties. The whole data set is presented to the SOM as
independent variables, and similarities among samples are then detected.
Complex, nonlinear relationships between high-dimensional data are
converted into simple geometric relationships on a low-dimensional
display (Kohonen, 1998). Even though the data are compressed, the most
important topological and metric relationships are preserved. SOMs are
also referred to as self-organized topological feature maps, since the
displayed topology of the data set reveals relationships between members
of the set (Guha et al., 2004).
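The mechanism described above can be sketched as a pure-Python Kohonen map in one dimension. This is a minimal illustration under assumed hyperparameters (grid size, learning-rate and radius schedules are all arbitrary choices, not values from the source):

```python
import math
import random

def train_som(data, n_units=10, epochs=50, lr0=0.5, radius0=3.0):
    """Train a one-dimensional Kohonen map on a list of input vectors."""
    dim = len(data[0])
    rng = random.Random(0)  # fixed seed for a reproducible sketch
    # Initialize the units' weight vectors randomly in [0, 1).
    units = [[rng.random() for _ in range(dim)] for _ in range(n_units)]
    for t in range(epochs):
        # Learning rate and neighborhood radius shrink over time.
        lr = max(0.01, lr0 * (1 - t / epochs))
        radius = max(0.5, radius0 * (1 - t / epochs))
        for x in data:
            # Competition: the best-matching unit (BMU) is the unit
            # whose weight vector is closest to the input.
            bmu = min(range(n_units), key=lambda i: sum(
                (units[i][k] - x[k]) ** 2 for k in range(dim)))
            # Cooperation/adaptation: the BMU and its neighbors on the
            # map grid move toward the input, weighted by a Gaussian
            # kernel over grid distance; this neighborhood coupling is
            # what preserves the topology of the data set on the map.
            for i in range(n_units):
                h = math.exp(-((i - bmu) ** 2) / (2 * radius ** 2))
                for k in range(dim):
                    units[i][k] += lr * h * (x[k] - units[i][k])
    return units
```

After training on data containing distinct groups, nearby units on the grid come to represent similar inputs while dissimilar inputs map to distant units, which is the topology-preserving dimensionality reduction described above.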
 