Chapter 10
Attractors in Associative Memories with Stochastic Weights
Abstract  Neural associative memories are considered in which the elements of the weight matrix are taken to be stochastic variables. The probability density function of each weight is given by the solution of Schrödinger's diffusion equation. The weights of the proposed associative memories are updated with a learning algorithm that is shown to satisfy two basic postulates of quantum mechanics: (a) existence in superimposed states, and (b) evolution between the superimposed states through unitary operators. Taking the elements of the weight matrix of the associative memory to be stochastic variables means that the initial weight matrix can be decomposed into a superposition of associative memories. This is equivalent to mapping the fundamental memories (attractors) of the associative memory into vector spaces that are spanned by the eigenvectors of the superimposed matrices and that are related to each other through unitary rotations. In this way, it can be shown that the storage capacity of associative memories with stochastic weights is exponentially larger than that of conventional associative memories.
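To make the decomposition concrete, the following sketch (not taken from the chapter) builds a Hebbian weight matrix from a few fundamental memories and writes a stochastic weight matrix as a weighted superposition of unitarily rotated copies of it. The array sizes, the use of random orthogonal matrices for the unitary rotations, and the choice of superposition coefficients are assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: M bipolar fundamental memories of dimension N,
# superimposed over K component matrices (all values assumed).
N, M, K = 16, 3, 4
patterns = rng.choice([-1.0, 1.0], size=(M, N))

# Hebbian (outer-product) weight matrix of a conventional associative memory.
W0 = (patterns.T @ patterns) / N
np.fill_diagonal(W0, 0.0)

def random_orthogonal(n, rng):
    # Random orthogonal matrix via QR; stands in for a unitary rotation.
    q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    return q

U = [random_orthogonal(N, rng) for _ in range(K)]

# Superposition coefficients whose squared magnitudes sum to one,
# in analogy with probability amplitudes.
c = rng.standard_normal(K)
c = c / np.linalg.norm(c)

# Stochastic weight matrix expressed as a superposition of unitarily
# rotated copies of the base associative memory.
W = sum(ck**2 * Uk @ W0 @ Uk.T for ck, Uk in zip(c, U))

# Each component U_k W0 U_k^T has the same spectrum as W0, so its
# attractors live in the eigenspace rotated by U_k.
print(np.allclose(np.linalg.eigvalsh(U[0] @ W0 @ U[0].T),
                  np.linalg.eigvalsh(W0)))
```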
10.1 Weights Learning in Associative Memories Is a Wiener Process
10.1.1 The Weights of Associative Memories Are Equivalent to Brownian Particles
In Sect. 8.2 a neural network with weights that follow the QHO (quantum harmonic oscillator) model was presented and the stability of learning was analyzed. The neural weights were represented by interacting Brownian particles. It will now be shown that this model is a generalization of known associative memories, and since it is compatible with the postulates of quantum mechanics it can be considered as a quantum associative memory.
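The following minimal sketch illustrates the idea of weight learning as a diffusion: each weight behaves like a Brownian particle that drifts toward its Hebbian value while being perturbed by a Wiener increment. The Ornstein-Uhlenbeck-type drift, the noise amplitude sigma, and the Euler-Maruyama discretization are illustrative assumptions and do not reproduce the QHO-based learning law of Sect. 8.2.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hebbian target weights of a conventional associative memory
# (sizes and patterns are illustrative).
N, M = 32, 2
patterns = rng.choice([-1.0, 1.0], size=(M, N))
W_target = (patterns.T @ patterns) / N
np.fill_diagonal(W_target, 0.0)

# Treat every weight as a Brownian particle: a drift term pulls it toward
# its Hebbian value while a Wiener increment dB ~ N(0, dt) perturbs it.
# Euler-Maruyama discretization of  dw = -gamma (w - w_target) dt + sigma dB.
gamma, sigma, dt, steps = 1.0, 0.02, 0.01, 2000
W = np.zeros_like(W_target)
for _ in range(steps):
    drift = -gamma * (W - W_target) * dt
    noise = sigma * np.sqrt(dt) * rng.standard_normal(W.shape)
    W = W + drift + noise

# Recall with the stochastic weights: iterate x <- sign(W x) from a
# corrupted probe; an overlap close to 1 indicates successful retrieval.
probe = patterns[0].copy()
probe[:3] *= -1.0                     # flip three bits
x = probe
for _ in range(10):
    x = np.sign(W @ x)
print(float(x @ patterns[0]) / N)
```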