Finally, we make a few remarks on the relationship between deterministic and stochastic dynamical systems. A deterministic neuron model is a special case of a stochastic neuron model: when the noise term vanishes, a stochastic neuron model automatically becomes deterministic. Usually there is a correspondence between the notions defined for stochastic and deterministic dynamical systems. For example, for the Lyapunov exponent introduced in the previous section for deterministic systems, an analogous quantity can be introduced for stochastic dynamical systems (see, for example, [55]). With the help of the Lyapunov exponent, we can understand phenomena such as how stochastic, but not deterministic, input currents can synchronize neurons with different initial states [20, 49].
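To illustrate this last point, consider the following minimal numerical sketch (not taken from the references above; the leaky integrate-and-fire model, the helper name final_gap, and all parameter values are illustrative assumptions). Two model neurons start from different initial membrane potentials but receive the same input current; with a shared stochastic current they tend to converge onto a common trajectory, whereas with a purely deterministic constant current their phase difference persists.

import numpy as np

# Hypothetical leaky integrate-and-fire parameters (illustrative only)
tau, v_th, v_reset = 20.0, 1.0, 0.0    # membrane time constant (ms), threshold, reset
dt, steps = 0.1, 200000                # Euler-Maruyama step (ms) and number of steps
mu = 1.2                               # constant suprathreshold drive

def final_gap(sigma, seed=0):
    """Gap between two neurons started in different states but driven by the
    SAME input current (deterministic drive mu plus a shared noise term)."""
    rng = np.random.default_rng(seed)
    v = np.array([0.0, 0.8])           # two different initial membrane potentials
    for _ in range(steps):
        xi = rng.standard_normal()     # one noise sample shared by both neurons
        v += (mu - v) * dt / tau + sigma * np.sqrt(dt) * xi
        v[v >= v_th] = v_reset         # spike-and-reset nonlinearity
    return abs(v[0] - v[1])

print("gap with common stochastic current:", final_gap(sigma=0.2))  # tends toward 0
print("gap with deterministic current    :", final_gap(sigma=0.0))  # generally stays nonzero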
1.4 Information theory
The nervous system is clearly a stochastic system [65], so we give a brief introduction to information theory [67]. Since neurons emit spikes randomly, we may ask how to characterize their input-output relationships. The simplest quantity is the correlation between input and output; however, information theory, with its roots in communication theory, has its own advantages.
1.4.1 Shannon information
Intuitively, information is closely related to the element of surprise. Hence, for an event A, we define
$$ S(A) = -\log_2(P(A)) $$
as the (Shannon) information of the event A, so that the information in a certain event is zero.
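For example (an illustrative sketch, not from the original text; the helper name shannon_information is our own), an event with probability 1/2, such as a fair coin landing heads, carries exactly one bit of information, while a certain event carries none:

import math

def shannon_information(p):
    """Shannon information -log2(p) of an event with probability p, in bits."""
    return -math.log2(p)

print(shannon_information(0.5))   # 1.0 bit: a fair coin flip
print(shannon_information(1.0))   # 0.0 bits: a certain event is not surprising
print(shannon_information(1/8))   # 3.0 bits: rarer events are more informative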
For a discrete random variable X with P(X = j) = p_j, its entropy is the mean of its information, i.e.,
$$ H(X) = -\sum_j p_j \log_2(p_j). $$
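As a quick illustration (again a sketch, not part of the original text), the entropy of a fair coin is 1 bit and that of a fair six-sided die is $\log_2 6 \approx 2.585$ bits; the lines below compute these values directly from the definition:

import math

def entropy(probs):
    """Entropy H(X) = -sum_j p_j log2(p_j) of a discrete distribution, in bits.
    Terms with p_j = 0 contribute nothing (0 * log 0 is taken as 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))        # 1.0 bit: fair coin
print(entropy([1/6] * 6))         # ~2.585 bits: fair die
print(entropy([0.9, 0.1]))        # ~0.469 bits: a biased coin is less uncertain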
When X is a continuous random variable, its entropy is analogously given by
$$ H(X) = -\int p(x) \log_2 p(x)\, dx, $$
where p(x) is the density of X.
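For a concrete check of this definition (an illustrative sketch, not from the original text), the differential entropy of a unit-variance Gaussian density can be approximated numerically and compared with its known closed form, $\tfrac{1}{2}\log_2(2\pi e) \approx 2.047$ bits:

import numpy as np

def gaussian_density(x, sigma=1.0):
    """Density p(x) of a zero-mean Gaussian with standard deviation sigma."""
    return np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

# Approximate H(X) = -integral of p(x) log2 p(x) dx with a Riemann sum on a fine grid
x = np.linspace(-10.0, 10.0, 200001)
dx = x[1] - x[0]
p = gaussian_density(x)
h_numeric = -np.sum(p * np.log2(p)) * dx

h_exact = 0.5 * np.log2(2 * np.pi * np.e)   # closed form for a unit-variance Gaussian
print(h_numeric, h_exact)                    # both approximately 2.047 bits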
The notion of entropy in information theory was first introduced by Claude Shannon, following a suggestion of John von Neumann: "You should call it entropy, for two reasons: first, the function is already in use in thermodynamics under that name; second, and more importantly, most people don't know what entropy really is, and if you use the word entropy in an argument you will win every time!"