In the late 1980s and early 1990s Thaler had the idea of investigating what would happen
to an ANN if he tried to “kill” it by randomly degrading the weights con-
necting the artificial neurons. He trained a network, then held its input
information constant and observed what happened to the output, while,
one by one, he switched off the network's connections or reduced their
strength at random—the computer equivalent of killing individual con-
nections between the neurons in a human brain. Intuitively one would
expect this to have the effect of completely destroying the performance
of the network, but instead the stunted versions were not incapable; they merely performed differently.
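In outline, the experiment resembles the following sketch. This is an illustrative toy in Python with numpy, not Thaler's actual network or code: a small feedforward network with placeholder "trained" weights is given a fixed input, and its connections are then zeroed out one at a time in random order while the output is recorded after each cut.

    # Illustrative sketch only -- a toy stand-in, not Thaler's actual network or code.
    import numpy as np

    rng = np.random.default_rng(0)

    # Placeholder "trained" weights for a tiny 3-4-2 feedforward network.
    W1 = rng.normal(size=(3, 4))   # input -> hidden connections
    W2 = rng.normal(size=(4, 2))   # hidden -> output connections

    def forward(x, W1, W2):
        # Simple forward pass: tanh hidden layer, linear output layer.
        return np.tanh(x @ W1) @ W2

    x = np.array([0.5, -0.2, 0.8])              # input held constant throughout
    print("intact output:", forward(x, W1, W2))

    # "Kill" connections one at a time: zero out weights in random order and
    # re-run the same fixed input after each deletion to watch the output drift.
    coords = [(layer, i, j)
              for layer, W in enumerate((W1, W2))
              for i in range(W.shape[0])
              for j in range(W.shape[1])]
    Ws = [W1.copy(), W2.copy()]
    for step, idx in enumerate(rng.permutation(len(coords)), start=1):
        layer, i, j = coords[idx]
        Ws[layer][i, j] = 0.0
        print(f"after cutting {step:2d} connections:", forward(x, Ws[0], Ws[1]))

With each cut the same input produces a slightly different output, which is the behaviour described above: the network does not simply fail, it answers differently.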
The network treated the changes in its internal state as though they had been caused by changes in the input information rather than by the death of some of its neurons, and it made a stab at what the outputs might be, much as a human might guess a word with missing letters. Thaler discovered that, as the neurons "died", the
network generated memories, then fragments of memories, and finally
new concepts created from fragments of memories. He also found that an ANN would respond in the same way if, instead of deleting connections, he simply changed some of their weights. His first breakthrough
came on Christmas Eve in 1989, when he typed the lyrics of some of his
favourite Christmas carols into an ANN. Once the network had learned
these songs he started to switch off its connections. Gradually the net-
work “began to hallucinate”, creating new ideas in the process. As it was
degrading, the network dreamed up new carols, each of which was cre-
ated from shards of its broken memories. One of its final creations was
the line “All men go to good earth in one eternal silent night.” What
most intrigued Thaler about the program's dying gasps was how creative
the process of dying could be. This prompted the idea: “What if I don't
cut the connections, but just perturb them a little?” [4]
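A hedged sketch of that idea, what the next paragraph calls "tickling", might look as follows, again using the same kind of illustrative toy network rather than Thaler's own software: instead of zeroing weights, a little Gaussian noise is added to each of them, and every noisy copy of the network then produces a variation on what it was trained to do.

    # Illustrative sketch only -- small random perturbations instead of deletions.
    import numpy as np

    rng = np.random.default_rng(1)

    # The same kind of placeholder "trained" weights as before.
    W1 = rng.normal(size=(3, 4))
    W2 = rng.normal(size=(4, 2))

    def forward(x, W1, W2):
        return np.tanh(x @ W1) @ W2

    x = np.array([0.5, -0.2, 0.8])
    print("baseline:", forward(x, W1, W2))

    # Each trial perturbs every weight by a small random amount (sigma sets how
    # hard) and records the variant output the perturbed network produces.
    sigma = 0.05
    for trial in range(1, 6):
        noisy_W1 = W1 + sigma * rng.normal(size=W1.shape)
        noisy_W2 = W2 + sigma * rng.normal(size=W2.shape)
        print(f"variant {trial}:", forward(x, noisy_W1, noisy_W2))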
Thaler tried “tickling” a few of the connections in the network, a
process akin to giving a human a shot of adrenaline or a small electri-
cal jolt to the brain. These tickling disturbances (called noise) caused
his ANNs to generate variations on whatever the original network was
trained to produce. For example, in order to generate silhouettes of
car shapes, Thaler first provided the network with the positions of the
salient points of a car's profile, such as the top and bottom of its wind-
screen. As the network was being trained, some of its nodes came to rep-
resent particular components of a car's shape, while the weights on the
connections represented ways in which these various components can be