WHEN CAPACITY IS EXCEEDED
In this way, the learning of new concepts corresponds to saturation with individual examples: the only way to learn a general concept is to present the learner (the network or the human brain; both behave in the same way) with a great number of individuals satisfying that concept.
For example, if a boy has to learn the concept 'chair', he will be shown a series of chairs of many styles and sizes until he is able to recognize a chair and distinguish it from a table or a bed.
This work tries to explain what may happen psychologically in the human brain. When a small number of patterns has to be memorized, the brain is able to recall all of them when necessary. Similarly, when the capacity of the net is not exceeded, the net is able to retrieve exactly the same patterns that were loaded into it. But when the brain receives a great amount of data to be recognized or classified, it distinguishes between groups of data (in an unsupervised way) and thus forms concepts. This kind of behaviour is also simulated by neural networks, as we will show next.
Learning rules such as Hebb's (or the more general one given by equation (3.1), where the connection between neurons is reinforced by the similarity of their expected outputs) may then produce classifiers that discover some knowledge from the input patterns, such as the actual number of groups into which the data are divided. An unsupervised clustering of the input pattern space is thus performed automatically. This unsupervised clustering generates the concept space, formed by the equivalence classes found for the input patterns.
If a pattern, say X, is to be loaded into the net, applying equation (3.1) creates a local minimum of the energy function E at V = X. If another pattern X' is far from X, loading it will create another local minimum. But if X and X' are close to each other, the two local minima created by the learning rule will merge, forming a single local minimum instead.
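To make the picture concrete, here is a minimal Python sketch (our illustration, not the chapter's code) of loading bipolar (±1) patterns, assuming that for such patterns equation (3.1) takes the classical Hebbian outer-product form, together with the energy function whose minima the text describes:

```python
import numpy as np

def load_pattern(W, x):
    """Hebbian outer-product update, a bipolar (+1/-1) instance of the
    generalized rule: w_ij is reinforced when components i and j agree
    and weakened when they disagree."""
    W = W + np.outer(x, x)
    np.fill_diagonal(W, 0)  # no self-connections
    return W

def energy(W, v):
    """Hopfield energy E(V) = -1/2 * V^T W V; every loaded pattern X
    creates (or deepens) a local minimum at V = X."""
    return -0.5 * v @ W @ v
```

Loading two patterns that are far apart digs two separate wells in E; if they are close, the wells overlap and merge into a single minimum, exactly as described above.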
Then, if a group of patterns is loaded into the net (overflowing its capacity) and all of them are close to each other, only one local minimum will be formed, and when these data are retrieved, the unique pattern recovered will be the one associated with the state of minimum energy. Patterns can thus be classified by the stable state of the net to which they converge. This stable state can be considered a representative of the concept associated with that group of patterns.
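The behaviour just described can be simulated directly. The following sketch (ours; names and parameters are illustrative) overloads a small net with a group of close patterns, noisy variants of a single prototype, and checks that distinct members of the group converge to the same stable state, the representative of the concept:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
prototype = rng.choice([-1, 1], size=n)

# Load a group of close patterns: 20 noisy variants of one prototype,
# deliberately exceeding what a 50-unit net can store as distinct items.
W = np.zeros((n, n))
for _ in range(20):
    variant = prototype.copy()
    flips = rng.choice(n, size=3, replace=False)  # small perturbation
    variant[flips] *= -1
    W += np.outer(variant, variant)
np.fill_diagonal(W, 0)

def retrieve(v, max_sweeps=100):
    """Asynchronous threshold updates until the state stops changing."""
    v = v.copy()
    for _ in range(max_sweeps):
        prev = v.copy()
        for i in rng.permutation(n):
            v[i] = 1 if W[i] @ v >= 0 else -1
        if np.array_equal(v, prev):
            break
    return v

# Two different members of the group converge to the same stable state,
# which serves as the representative of the concept.
probe_a, probe_b = prototype.copy(), prototype.copy()
probe_a[:3] *= -1
probe_b[-3:] *= -1
print(np.array_equal(retrieve(probe_a), retrieve(probe_b)))  # expected: True
```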
RELATIONSHIPS LEARNING
The equation for the learning rule, a generalization of Hebb's rule, shows that the only thing taking part in the update of the weight matrix is the pattern being loaded into the net at that time. It therefore represents very 'local' information and does not take into account the possible relationships that this pattern could have with the ones already stored. It is thus convenient to introduce an additional mechanism into the learning phase, such that information concerning the relationships between patterns is incorporated into the update of the weight matrix. In what follows, we will consider that the similarity function is f(x, y) = 2δ_{x,y} − 1; that is, its value is 1 if x = y and −1 otherwise.
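For bipolar (±1) components, this function coincides with the plain product x·y, which is why the generalized rule reduces to Hebb's rule in that case. A one-line illustrative snippet:

```python
def f(x, y):
    """Similarity function f(x, y) = 2*delta(x, y) - 1:
    1 if x == y, -1 otherwise."""
    return 1 if x == y else -1

# For bipolar components this is just the product x * y:
assert f(1, 1) == 1 and f(1, -1) == -1 and f(-1, -1) == (-1) * (-1)
```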
Relationships learning method: Suppose that we have the (augmented) pattern X¹ stored in the net, so that we have the weight matrix W = (w_ij). If pattern X² is to be loaded into the network, applying equation (3.1) yields the components of the increment matrix ∆W.
If w_ij and ∆w_ij have positive sign (both values equal 1), it means that X¹_i = X¹_j and X²_i = X²_j, indicating a relationship between components i and j of X¹ and X². If both are negative, something similar happens, but with inequalities instead of equalities.
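A small sketch of this sign-agreement test (our code; the chapter gives none), using the fact that for bipolar patterns the increment produced by loading X² is the outer product of X² with itself:

```python
import numpy as np

x1 = np.array([1, 1, -1, -1])   # stored (augmented) pattern X1
x2 = np.array([1, 1, 1, -1])    # new pattern X2

W = np.outer(x1, x1)            # weights after loading X1
dW = np.outer(x2, x2)           # Hebbian increment for X2

# Both +1: components i and j are equal in X1 and equal in X2.
# Both -1: components i and j differ in X1 and differ in X2.
both_pos = (W == 1) & (dW == 1)
both_neg = (W == -1) & (dW == -1)

print(both_pos[0, 1])  # True: x1[0] == x1[1] and x2[0] == x2[1]
print(both_neg[0, 3])  # True: x1[0] != x1[3] and x2[0] != x2[3]
```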