Another criterion is provided by the Kullback-Leibler divergence (KL), which provides a notion of asymmetric “distance” between two probabilities. Its computation is numerically delicate but, in the present context of Gibbs distributions, the following holds. If μ is the hidden (time-translation invariant) probability and μ_β a Gibbs distribution with potential φ_β, one has [10, 28]:
d(μ, μ_β) = P(φ_β) − μ[φ_β] − h(μ).    (8.30)
This allows, in principle, estimating the divergence of our model from the hidden probability μ, which provides the exact spike train statistics. The smaller d(μ, μ_β), the better the model. Unfortunately, since μ is unknown, this criterion looks useless. However, from Sect. 8.3.2.3, μ[φ_β] is well approximated by π_ω^(T)[φ_β], which can be computed from the raster. Additionally, the entropy h(μ) is unknown and its estimation by numerical algorithms for a large number of neurons is delicate [68]. However, when considering two statistical models μ_β1, μ_β2 with potentials φ_β1, φ_β2 to analyze the same data, h(μ) is a constant (it only depends on the data).
Thus, comparing these two models amounts to comparing P[φ_β1] − π_ω^(T)[φ_β1] and P[φ_β2] − π_ω^(T)[φ_β2]. Thus, the quantity
h[φ] = P[φ] − π_ω^(T)[φ],    (8.31)
provides a relative criterion to compare models, i.e., determining whether model φ_β2 is significantly “better” than model φ_β1 reduces to the condition:

h[φ_β2] ≤ h[φ_β1].    (8.32)
Its computation is detailed in [74, 75].
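To make the comparison procedure concrete, here is a minimal sketch (in Python, not taken from [74, 75]) of how the relative criterion (8.31) could be evaluated for two candidate potentials. The function names, the block-wise evaluation of the potential, and the assumption that an estimate of the pressure P[φ] is supplied externally (e.g., by the methods of Sect. 8.3) are illustrative assumptions, not the authors' implementation.

import numpy as np

def empirical_average(raster, phi, range_R):
    # Empirical average pi_omega^(T)[phi]: average of the potential phi
    # over all spike blocks of range R in the raster (N x T binary array).
    N, T = raster.shape
    values = [phi(raster[:, t:t + range_R]) for t in range(T - range_R + 1)]
    return np.mean(values)

def relative_criterion(pressure_phi, raster, phi, range_R):
    # h[phi] = P[phi] - pi_omega^(T)[phi]  (Eq. 8.31).
    # pressure_phi is an externally supplied estimate of the pressure P[phi].
    return pressure_phi - empirical_average(raster, phi, range_R)

# Usage (Eq. 8.32): the model with the smaller criterion is the better one,
# since both values differ from the KL divergence (8.30) by the same unknown
# constant h(mu):
#   relative_criterion(P2, raster, phi2, R) <= relative_criterion(P1, raster, phi1, R)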
8.4 Using Gibbs Distributions to Analyze Spike Train Statistics
In this section we show how the statistical tools presented in this chapter can be used to analyze spike train statistics. In the “challenge” section we mention the current controversy about the question: are G cells, as sensors, independent encoders or, on the contrary, are neural correlations important for coding? We present here recent works where Gibbs distributions have been used to address this question, with important implications for neural coding. However, as we show, those examples also raise additional and fundamental questions, some of which can be addressed on theoretical grounds by studying neural network models. A third section presents an example of such a model where spike trains are known to have Gibbs statistics and where the potential is explicitly known. We compare those results to the current state of the art in spike train analysis with Ising distributions.