We can derive a similar formula to compute the transfer entropy. To do so, we use the
similarity between the transfer entropy and mutual information. Using Eq. (C.45),
the conditional entropy H(\tilde{y} \mid x, y) is rewritten as

H(\tilde{y} \mid x, y) = H(\tilde{y}, x \mid y) - H(x \mid y).    (8.97)
Substituting this equation into Eq. (8.81), we get

H_{x \to y} = H(x \mid y) + H(\tilde{y} \mid y) - H(\tilde{y}, x \mid y).    (8.98)
Comparing the equation above with Eq. (C.46), it can be seen that the transfer entropy is equal to the mutual information between \tilde{y}(t) and x(t) when y(t) is given, i.e., the conditional mutual information I(\tilde{y}; x \mid y).
We define \Sigma_{u,v|w} as

\Sigma_{u,v|w} = \Sigma_{uv} - \Sigma_{uw} \Sigma_{ww}^{-1} \Sigma_{vw}^{T}.    (8.99)
Then, using Eq. (C.61), we can express H(\tilde{y} \mid y) and H(x \mid y) such that

H(\tilde{y} \mid y) = \frac{1}{2} \log \left| \Sigma_{\tilde{y},\tilde{y}|y} \right|,    (8.100)

H(x \mid y) = \frac{1}{2} \log \left| \Sigma_{x,x|y} \right|.    (8.101)
On the basis of Eq. (C.59), we also derive

H(\tilde{y}, x \mid y) = \frac{1}{2} \log \left| \begin{matrix} \Sigma_{\tilde{y},\tilde{y}|y} & \Sigma_{x,\tilde{y}|y}^{T} \\ \Sigma_{x,\tilde{y}|y} & \Sigma_{x,x|y} \end{matrix} \right|.    (8.102)
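Numerically, the three entropies in Eqs. (8.100)-(8.102) are log-determinants of blocks returned by cond_cov above, so Eq. (8.98) can be evaluated directly. Below is a sketch under the same assumptions; the routine name gaussian_transfer_entropy is ours, and the additive constants of the Gaussian differential entropy are dropped because they cancel in Eq. (8.98).

import numpy as np

def gaussian_transfer_entropy(S, i_yf, i_x, i_y):
    """Transfer entropy H_{x->y} via Eq. (8.98).

    i_yf, i_x, i_y : index lists for y~ (the future of y), x, and y.
    Uses cond_cov from the previous listing; constant entropy terms cancel.
    """
    def H(idx):
        # Half the log-determinant of the conditional covariance given y
        _, logdet = np.linalg.slogdet(cond_cov(S, idx, idx, i_y))
        return 0.5 * logdet

    # Eqs. (8.100), (8.101), and (8.102) substituted into Eq. (8.98)
    return H(i_x) + H(i_yf) - H(list(i_yf) + list(i_x))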
Thus, substituting the equations above into Eq. (8.98), we obtain

H_{x \to y} = \frac{1}{2} \log \frac{1}{\left| I - \Sigma_{\tilde{y},\tilde{y}|y}^{-1} \Sigma_{\tilde{y},x|y} \Sigma_{x,x|y}^{-1} \Sigma_{\tilde{y},x|y}^{T} \right|}.    (8.103)
Accordingly, defining the eigenvalues of the matrix

\Sigma_{\tilde{y},\tilde{y}|y}^{-1} \Sigma_{\tilde{y},x|y} \Sigma_{x,x|y}^{-1} \Sigma_{\tilde{y},x|y}^{T}    (8.104)

as \gamma_j (j = 1, \ldots, d), the transfer entropy is given by

H_{x \to y} = \frac{1}{2} \sum_{j=1}^{d} \log \frac{1}{1 - \gamma_j}.    (8.105)
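Equivalently, Eq. (8.105) can be evaluated from the eigenvalues of the matrix in Eq. (8.104), avoiding the explicit determinant of Eq. (8.103). The sketch below makes the same assumptions as the earlier listings (cond_cov defined above); the \gamma_j can be read as squared canonical correlations between \tilde{y} and x given y, so each lies in [0, 1).

import numpy as np

def transfer_entropy_eig(S, i_yf, i_x, i_y):
    """Transfer entropy H_{x->y} via the eigenvalue form of Eq. (8.105)."""
    # Conditional covariance blocks (cond_cov from the first listing)
    S_ff = cond_cov(S, i_yf, i_yf, i_y)   # Sigma_{y~,y~|y}
    S_fx = cond_cov(S, i_yf, i_x,  i_y)   # Sigma_{y~,x|y}
    S_xx = cond_cov(S, i_x,  i_x,  i_y)   # Sigma_{x,x|y}
    # Matrix of Eq. (8.104); its eigenvalues are the gamma_j
    M = np.linalg.solve(S_ff, S_fx) @ np.linalg.solve(S_xx, S_fx.T)
    gamma = np.linalg.eigvals(M).real     # real and in [0, 1) in exact arithmetic
    # Eq. (8.105)
    return 0.5 * np.sum(np.log(1.0 / (1.0 - gamma)))

Both routines compute the same quantity and should agree up to floating-point error; the eigenvalue form additionally exposes how much each canonical pair contributes to the total.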