where⁴

\[
\Sigma_{yy} = \langle \mathbf{y}\,\mathbf{y}^{T} \rangle, \tag{8.83}
\]
\[
\Sigma_{y\tilde{y}} = \langle \mathbf{y}\,\tilde{\mathbf{y}}^{T} \rangle, \tag{8.84}
\]
\[
\Sigma_{\tilde{y}\tilde{y}} = \langle \tilde{\mathbf{y}}\,\tilde{\mathbf{y}}^{T} \rangle. \tag{8.85}
\]
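As a concrete illustration, the covariance blocks of Eqs. (8.83)–(8.85) can be estimated from a sample time series by averaging outer products of the current value and the stacked past values. The signal, sample count, and embedding order below are illustrative assumptions, not values from the text:

```python
import numpy as np

rng = np.random.default_rng(0)
T, P = 5000, 2                   # sample count and embedding order (assumed values)
y = rng.standard_normal(T)       # toy scalar time series

# Current values y(t) and past-value vectors y~(t) = [y(t-1), ..., y(t-P)]^T
Y = y[P:]
Ytil = np.column_stack([y[P - p:T - p] for p in range(1, P + 1)])

# Sample estimates of Eqs. (8.83)-(8.85): <y y^T>, <y y~^T>, <y~ y~^T>
S_yy = Y @ Y / len(Y)            # scalar here, since y(t) is one-dimensional
S_yyt = Y @ Ytil / len(Y)        # 1 x P block Sigma_{y y~}
S_ytyt = Ytil.T @ Ytil / len(Y)  # P x P block Sigma_{y~ y~}
```

For white Gaussian noise, `S_yy` is close to 1 and the off-diagonal blocks are close to zero; with real data the same estimators populate the covariance blocks used in the entropy expressions that follow.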
Similarly, the conditional entropy $H(\mathbf{y} \mid \tilde{\mathbf{x}}, \tilde{\mathbf{y}})$ is expressed as

\[
H(\mathbf{y} \mid \tilde{\mathbf{x}}, \tilde{\mathbf{y}})
= H(\mathbf{y} \mid \tilde{\mathbf{z}})
= \frac{1}{2} \log \left| \Sigma_{yy} - \Sigma_{y\tilde{z}} \Sigma_{\tilde{z}\tilde{z}}^{-1} \Sigma_{\tilde{z}y} \right|, \tag{8.86}
\]

where $\tilde{\mathbf{z}} = [\tilde{\mathbf{x}}^{T}, \tilde{\mathbf{y}}^{T}]^{T}$ and

\[
\Sigma_{y\tilde{z}} = \langle \mathbf{y}\,\tilde{\mathbf{z}}^{T} \rangle, \tag{8.87}
\]
\[
\Sigma_{\tilde{z}\tilde{z}} = \langle \tilde{\mathbf{z}}\,\tilde{\mathbf{z}}^{T} \rangle. \tag{8.88}
\]
Thus, the transfer entropy $H_{x \to y}$ is given by

\[
H_{x \to y}
= H(\mathbf{y} \mid \tilde{\mathbf{y}}) - H(\mathbf{y} \mid \tilde{\mathbf{x}}, \tilde{\mathbf{y}})
= H(\mathbf{y} \mid \tilde{\mathbf{y}}) - H(\mathbf{y} \mid \tilde{\mathbf{z}})
= \frac{1}{2} \log
\frac{\left| \Sigma_{yy} - \Sigma_{y\tilde{y}} \Sigma_{\tilde{y}\tilde{y}}^{-1} \Sigma_{\tilde{y}y} \right|}
     {\left| \Sigma_{yy} - \Sigma_{y\tilde{z}} \Sigma_{\tilde{z}\tilde{z}}^{-1} \Sigma_{\tilde{z}y} \right|}. \tag{8.89}
\]
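A minimal numerical sketch of Eq. (8.89): for jointly Gaussian data, the transfer entropy reduces to half the log-ratio of the two conditional (residual) variances. The coupled AR(1) process below, in which x drives y, is an assumed toy example rather than anything from the text:

```python
import numpy as np

rng = np.random.default_rng(1)
T, P = 20000, 1
x = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):                 # toy coupled AR(1): x causally drives y
    x[t] = 0.5 * x[t - 1] + rng.standard_normal()
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.standard_normal()

Y = y[P:]
Ytil = np.column_stack([y[P - p:T - p] for p in range(1, P + 1)])
Xtil = np.column_stack([x[P - p:T - p] for p in range(1, P + 1)])
Z = np.column_stack([Xtil, Ytil])     # z~ = [x~^T, y~^T]^T

def cond_var(a, B):
    """Residual variance S_aa - S_aB S_BB^{-1} S_Ba for a scalar target a."""
    n = len(a)
    S_aa = a @ a / n
    S_aB = a @ B / n
    S_BB = B.T @ B / n
    return S_aa - S_aB @ np.linalg.solve(S_BB, S_aB)

# Eq. (8.89): H_{x->y} = H(y|y~) - H(y|z~) = (1/2) log of the variance ratio
TE = 0.5 * np.log(cond_var(Y, Ytil) / cond_var(Y, Z))
print(TE)   # positive, since conditioning on x's past shrinks y's residual variance
```

Because x influences y here, the denominator (residual variance given both pasts) is smaller than the numerator (given y's past alone), so the estimate comes out positive; for independent series it would fluctuate near zero.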
8.6.3 Equivalence Between Transfer Entropy and Granger
Causality
We will show that the transfer entropy and Granger causality are equivalent under the Gaussianity assumption. The arguments here follow those in [14]. As discussed in Sect. 8.3.2, we consider the two forms of the regression to define Granger causality.
In the first regression, $\mathbf{y}(t)$ is regressed using only its past values, such that

\[
\mathbf{y}(t) = \sum_{p=1}^{P} A(p)\, \mathbf{y}(t - p) + \mathbf{e}
= A\, \tilde{\mathbf{y}}(t) + \mathbf{e}, \tag{8.90}
\]
where $A = [A(1), \ldots, A(P)]$, $\tilde{\mathbf{y}}(t)$ is defined in Eq. (8.78), and $\mathbf{e}$ is a residual vector.
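The restricted regression of Eq. (8.90) can be fitted by ordinary least squares on the stacked past-value vectors. The AR order, coefficients, and simulated data below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
T, P = 10000, 2
A_true = [0.5, -0.3]                  # assumed AR(2) coefficients (stationary)
y = np.zeros(T)
for t in range(2, T):
    y[t] = A_true[0] * y[t - 1] + A_true[1] * y[t - 2] + rng.standard_normal()

# Stack past-value vectors y~(t) and solve y(t) = A y~(t) + e by least squares
Ytil = np.column_stack([y[P - p:T - p] for p in range(1, P + 1)])
Y = y[P:]
A_hat, *_ = np.linalg.lstsq(Ytil, Y, rcond=None)
print(A_hat)   # close to A_true for a long stationary realization
```

The residual variance of this fit (and of the corresponding full regression that also includes the past of x) supplies the variance ratio that defines Granger causality, mirroring the ratio in Eq. (8.89).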
In the second regression, $\mathbf{y}(t)$ is regressed using not only its past values but also the past values of $\mathbf{x}(t)$, such that
⁴ Note that in Sect. C.3.3 the expectation operator $E[\,\cdot\,]$ is used, instead of the averaging operator $\langle \cdot \rangle$. They have the same meaning in the arguments here.