Parameter ξ is a small arbitrary factor, ranging in our case from 0.1 to 0.0001.
This equation means that the more the neuron keeps firing, the greater the
rightward shift of the activation function, which moderates the neuron's firing
probability in the future. Conversely, if the firing probability is low, the sigmoid
moves leftwards, thereby increasing the probability of the neuron's firing in the
future. Parameter ξ acts as a learning factor. If we want quick convergence of the
shifts with little concern for accuracy, we select a higher ξ. However, if the
duration of learning is not a concern and accuracy is the main interest, ξ must
be set to a very small value.
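The shift rule described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the target firing rate of 0.5, and the exact form of the update are assumptions introduced here for clarity; only the role of ξ and the direction of the shifts follow the text.

```python
import math

def sigmoid(x, shift):
    """Sigmoidal activation with an adjustable horizontal shift."""
    return 1.0 / (1.0 + math.exp(-(x - shift)))

def update_shift(shift, firing_prob, target=0.5, xi=0.01):
    """Intrinsic-plasticity update (illustrative form): firing above the
    target rate moves the sigmoid rightward, lowering future firing
    probability; firing below it moves the sigmoid leftward.
    The target rate of 0.5 is an assumption, not taken from the paper."""
    return shift + xi * (firing_prob - target)

# A persistently driven neuron: its sigmoid drifts rightward over time,
# moderating its firing probability.
shift = 0.0
for _ in range(100):
    p = sigmoid(1.0, shift)
    shift = update_shift(shift, p, xi=0.1)
```

With a larger ξ (here 0.1) the shift converges quickly but coarsely; a very small ξ (e.g. 0.0001) would converge slowly but track the firing statistics more accurately, as the text notes.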
Competition Between Neurons. Another neural mechanism used in our
simulation is competition between neurons. In biological neural networks, in-
hibitory neurons such as basket or chandelier cells are able to reduce the activity
of neighboring excitatory cells so that only a few excitatory ones remain active.
This competition makes it possible to identify the most activated neurons inside
a neural pool. In our case, the most activated neuron remains active while the
remaining ones are inactivated.
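The competition mechanism just described amounts to a winner-take-all selection over a neural pool. A minimal sketch, assuming activations are represented as a list of real values (the function name and representation are ours, not the paper's):

```python
def winner_take_all(activations):
    """Competition between neurons: the most activated neuron keeps its
    activity, while all remaining neurons in the pool are inactivated
    (set to zero), mimicking lateral inhibition by inhibitory cells."""
    winner = max(range(len(activations)), key=lambda i: activations[i])
    return [a if i == winner else 0.0 for i, a in enumerate(activations)]

# Example: the second neuron is the most activated and survives.
winner_take_all([0.2, 0.9, 0.5])  # → [0.0, 0.9, 0.0]
```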
Fig. 4. According to intrinsic plasticity, low neuron activation (cases a, b, c) contributes
to a leftward shift of the sigmoidal activation function. On the contrary, high neuron
activation (cases d, e, f) contributes to a rightward shift of this sigmoidal function.
3 Results with Different Kinds of Stimuli
As mentioned in the introduction, the purpose of our work is to apply the inclined-
floor stimulation technique to a robot in order to understand how this technique
contributes to accelerating crawling abilities in children.