We denote two random vector time series $x(t)$ and $y(t)$. Let us define vectors $\tilde{x}(t)$ and $\tilde{y}(t)$ as those made by concatenating their past values, such that

$$
\tilde{x}(t) = \begin{bmatrix} x(t-1) \\ x(t-2) \\ \vdots \\ x(t-P) \end{bmatrix}
\quad\text{and}\quad
\tilde{y}(t) = \begin{bmatrix} y(t-1) \\ y(t-2) \\ \vdots \\ y(t-P) \end{bmatrix}.
\tag{8.78}
$$
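As a concrete illustration, the following Python sketch (a hypothetical helper, not code from the text) builds the stacked past-value vector of Eq. (8.78) from a multivariate time series; the lag order $P$ and the array shapes are chosen purely for the example.

```python
import numpy as np

def past_vector(series: np.ndarray, t: int, P: int) -> np.ndarray:
    """Stack the P past samples [series[t-1], ..., series[t-P]] of a
    (T, d) multivariate time series into one (P*d,) vector (Eq. 8.78)."""
    assert t >= P, "need at least P past samples"
    return np.concatenate([series[t - k] for k in range(1, P + 1)])

# Example: two 3-channel series of length 200, lag order P = 5.
rng = np.random.default_rng(0)
x = rng.standard_normal((200, 3))
y = rng.standard_normal((200, 3))
x_tilde = past_vector(x, t=50, P=5)   # shape (15,)
y_tilde = past_vector(y, t=50, P=5)   # shape (15,)
```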
We then define the conditional entropy of $y(t)$, given its past values $\tilde{y}(t)$, such that

$$
H(y \mid \tilde{y}) = -\int p(y, \tilde{y}) \log p(y \mid \tilde{y}) \, dy \, d\tilde{y}.
\tag{8.79}
$$
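For computation it is often convenient to use the chain rule of entropy, a standard identity (not stated explicitly in the text) that follows directly from the definition above:

$$
H(y \mid \tilde{y}) = H(y, \tilde{y}) - H(\tilde{y}).
$$

The same decomposition applies to the conditional entropy below, with the joint past $(\tilde{x}, \tilde{y})$ in place of $\tilde{y}$.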
Similarly, we define the conditional entropy of $y(t)$, given the past values $\tilde{x}(t)$ and $\tilde{y}(t)$, such that

$$
H(y \mid \tilde{x}, \tilde{y}) = -\int p(\tilde{x}, y, \tilde{y}) \log p(y \mid \tilde{x}, \tilde{y}) \, dy \, d\tilde{x} \, d\tilde{y}.
\tag{8.80}
$$
The transfer entropy $H_{x \to y}$ is defined as

$$
H_{x \to y} = H(y \mid \tilde{y}) - H(y \mid \tilde{x}, \tilde{y})
= \int p(\tilde{x}, y, \tilde{y}) \log \frac{p(y \mid \tilde{x}, \tilde{y})}{p(y \mid \tilde{y})} \, dy \, d\tilde{x} \, d\tilde{y}.
\tag{8.81}
$$
In the equations above, the explicit time notation $(t)$ is omitted for simplicity. In Eq. (8.81), $H(y \mid \tilde{y})$ represents the uncertainty about the current value of $y$ when we know $\tilde{y}$, the past values of $y$. Also, $H(y \mid \tilde{x}, \tilde{y})$ represents the uncertainty about the current value of $y$ when we know both $\tilde{x}$ and $\tilde{y}$, the past values of $x$ and $y$. Therefore, the transfer entropy is equal to the reduction in uncertainty about the current value of $y$ that results from knowing the past values of $x$.
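To make this "reduction of uncertainty" reading concrete, here is a minimal Python sketch (an illustrative toy construction, not from the text) that estimates Eq. (8.81) for a binary system with lag order $P = 1$, using plug-in histogram entropies and the chain rule $H(a \mid b) = H(a, b) - H(b)$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy system: y(t) copies x(t-1), flipped with probability 0.1,
# so the past of x should strongly reduce the uncertainty of y.
T = 100_000
x = rng.integers(0, 2, T)
flip = rng.random(T) < 0.1
y = np.empty(T, dtype=int)
y[0] = rng.integers(0, 2)
y[1:] = np.where(flip[1:], 1 - x[:-1], x[:-1])

# Lag-1 alignment: current y, past y, past x (Eq. 8.78 with P = 1).
y_cur, y_past, x_past = y[1:], y[:-1], x[:-1]

def entropy(*cols):
    """Plug-in joint entropy (in nats) from empirical frequencies."""
    _, counts = np.unique(np.column_stack(cols), axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

# Conditional entropies via the chain rule H(a|b) = H(a, b) - H(b).
H_y_given_past_y = entropy(y_cur, y_past) - entropy(y_past)
H_y_given_past_xy = entropy(y_cur, x_past, y_past) - entropy(x_past, y_past)

# Transfer entropy (Eq. 8.81): uncertainty reduction from the past of x.
print(f"H(y|~y)    = {H_y_given_past_y:.3f} nats")   # ~ log 2
print(f"H(y|~x,~y) = {H_y_given_past_xy:.3f} nats")  # ~ 0.33 (binary entropy of 0.1)
print(f"TE x->y    = {H_y_given_past_y - H_y_given_past_xy:.3f} nats")
```

Here the past of $y$ alone leaves $y$ essentially unpredictable, while the past of $x$ removes most of the uncertainty, so the estimated transfer entropy is clearly positive.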
8.6.2 Transfer Entropy Under Gaussianity Assumption
Assuming that the random vectors $x$ and $y$ follow a Gaussian distribution, and using Eq. (C.61), the conditional entropy $H(y \mid \tilde{y})$ is expressed as [14]

$$
H(y \mid \tilde{y}) = \frac{1}{2} \log \left| \Sigma_{yy} - \Sigma_{y\tilde{y}} \, \Sigma_{\tilde{y}\tilde{y}}^{-1} \, \Sigma_{y\tilde{y}}^{T} \right|,
\tag{8.82}
$$
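Under the Gaussian assumption, Eq. (8.82) reduces each conditional entropy to the log-determinant of a conditional (Schur-complement) covariance, so the transfer entropy of Eq. (8.81) can be estimated from sample covariances alone. Below is a minimal sketch under that assumption; the simulated system and the helper names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical system: y is driven by the past of x, so TE(x->y) > 0.
T, P = 20_000, 2
x = rng.standard_normal(T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.standard_normal()

def past(series, P):
    """Rows are the stacked past values [s(t-1), ..., s(t-P)] of Eq. (8.78),
    one row per time point t = P, ..., T-1."""
    Tn = len(series)
    return np.column_stack([series[P - k : Tn - k] for k in range(1, P + 1)])

y_cur = y[P:].reshape(-1, 1)
x_past, y_past = past(x, P), past(y, P)

def cond_entropy(a, b):
    """Gaussian conditional entropy H(a|b) up to additive constants,
    via Eq. (8.82): 0.5 * log |S_aa - S_ab S_bb^{-1} S_ab^T|."""
    S = np.cov(np.hstack([a, b]), rowvar=False)
    da = a.shape[1]
    S_aa, S_ab, S_bb = S[:da, :da], S[:da, da:], S[da:, da:]
    cond = S_aa - S_ab @ np.linalg.solve(S_bb, S_ab.T)
    return 0.5 * np.log(np.linalg.det(cond))

# Transfer entropy (Eq. 8.81) as a difference of Eq. (8.82) terms.
te = (cond_entropy(y_cur, y_past)
      - cond_entropy(y_cur, np.hstack([x_past, y_past])))
print(f"Gaussian transfer entropy x->y: {te:.3f} nats")  # ~ 0.5*log(1.64)
```

Any additive constants in the Gaussian entropy are identical in the two conditional entropies, so they cancel in the subtraction; only the log-determinant terms of Eq. (8.82) matter for the transfer entropy.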