$$I_{\mathrm{knnr}}(X;Y) = \psi(k) + \psi(N) - \frac{1}{N}\sum_{i=1}^{N}\bigl[\psi\bigl(n_x(i)+1\bigr) + \psi\bigl(n_y(i)+1\bigr)\bigr]. \qquad (19.13)$$
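Equation (19.13) can be checked numerically with a short NumPy/SciPy sketch, where $\psi(\cdot)$ is the digamma function. The function name `mi_knn` and the max-norm convention for the joint-space neighbor search are our assumptions, not prescriptions from the text:

```python
import numpy as np
from scipy.special import digamma

def mi_knn(x, y, k=3):
    """Sketch of the k-nearest-neighbor MI estimator of Eq. (19.13).

    n_x(i) and n_y(i) count the points whose x- (resp. y-) distance to
    point i is strictly smaller than the distance from i to its k-th
    nearest neighbor in the joint (x, y) space (max-norm).
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    N = len(x)
    dx = np.abs(x[:, None] - x[None, :])
    dy = np.abs(y[:, None] - y[None, :])
    dz = np.maximum(dx, dy)                  # max-norm in the joint space
    np.fill_diagonal(dz, np.inf)             # a point is not its own neighbor
    eps = np.sort(dz, axis=1)[:, k - 1]      # distance to the k-th joint neighbor
    n_x = (dx < eps[:, None]).sum(axis=1) - 1    # subtract the point itself
    n_y = (dy < eps[:, None]).sum(axis=1) - 1
    return digamma(k) + digamma(N) - np.mean(digamma(n_x + 1) + digamma(n_y + 1))
```

For independent samples the estimate fluctuates around zero, while for strongly dependent pairs it grows toward the true mutual information.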
19.3.2 Nonlinear Interdependencies
Arnhold et al. [2] introduced nonlinear interdependence measures for characterizing directional relationships (i.e., driver and response) between two time sequences. Given two time series $x$ and $y$, using the method of delays we obtain the delay vectors $\mathbf{x}_n = (x_n, \ldots, x_{n-(m-1)\tau})$ and $\mathbf{y}_n = (y_n, \ldots, y_{n-(m-1)\tau})$, where $n = 1, \ldots, N$, $m$ is the embedding dimension, and $\tau$ denotes the time delay [34]. Let $r_{n,j}$ and $s_{n,j}$, $j = 1, \ldots, k$, denote the time indices of the $k$ nearest neighbors of $\mathbf{x}_n$ and $\mathbf{y}_n$, respectively. For each $\mathbf{x}_n$, the mean squared Euclidean distance to its $k$ neighbors is defined as

$$R_n^{(k)}(X) = \frac{1}{k}\sum_{j=1}^{k}\bigl(\mathbf{x}_n - \mathbf{x}_{r_{n,j}}\bigr)^2, \qquad (19.14)$$
and the $Y$-conditioned mean squared Euclidean distance is defined by replacing the nearest neighbors by the equal time partners of the closest neighbors of $\mathbf{y}_n$:

$$R_n^{(k)}(X|Y) = \frac{1}{k}\sum_{j=1}^{k}\bigl(\mathbf{x}_n - \mathbf{x}_{s_{n,j}}\bigr)^2. \qquad (19.15)$$
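The two conditioned distances can be sketched as follows; the helper names (`delay_embed`, `conditioned_distances`) are ours, squared Euclidean distances follow Eqs. (19.14)–(19.15), and the Theiler correction is implemented by excluding temporally close neighbors from the search:

```python
import numpy as np

def delay_embed(x, m, tau):
    """Delay vectors x_n = (x_n, x_{n-tau}, ..., x_{n-(m-1)tau})."""
    x = np.asarray(x, float)
    start = (m - 1) * tau
    return np.column_stack([x[start - i * tau: len(x) - i * tau] for i in range(m)])

def conditioned_distances(x, y, m=10, tau=5, k=4, theiler=50):
    """Sketch of Eqs. (19.14)-(19.15): R_n^(k)(X) and R_n^(k)(X|Y).

    Neighbors within `theiler` samples in time are excluded from the
    search (Theiler correction).
    """
    X, Y = delay_embed(x, m, tau), delay_embed(y, m, tau)
    N = len(X)
    dX = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # squared distances
    dY = ((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    t = np.arange(N)
    mask = np.abs(t[:, None] - t[None, :]) <= theiler     # Theiler window
    r = np.argsort(np.where(mask, np.inf, dX), axis=1)[:, :k]  # r_{n,j}: neighbors in X
    s = np.argsort(np.where(mask, np.inf, dY), axis=1)[:, :k]  # s_{n,j}: neighbors in Y
    R_X  = np.take_along_axis(dX, r, axis=1).mean(axis=1)  # Eq. (19.14)
    R_XY = np.take_along_axis(dX, s, axis=1).mean(axis=1)  # Eq. (19.15)
    return R_X, R_XY
```

By construction $R_n^{(k)}(X|Y) \ge R_n^{(k)}(X)$, since no $k$-subset of allowed partners can beat the $k$ true nearest neighbors.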
For EEG, the delay $\tau = 5$ is estimated using the auto mutual information function, the embedding dimension $m = 10$ is obtained using Cao's method, and the Theiler correction is set to $T = 50$ (the Theiler correction corresponds to the first $T$ sample points omitted from our analysis) [3, 35]. If $\mathbf{x}_n$ has an average Euclidean radius $R(X) = (1/N)\sum_{n=1}^{N} R_n^{(N-1)}(X)$, then $R_n^{(k)}(X|Y) \approx R_n^{(k)}(X) < R(X)$ if the systems are strongly correlated, while $R_n^{(k)}(X|Y) \approx R(X) > R_n^{(k)}(X)$ if they are independent [24].
Accordingly, the interdependence measure $S^{(k)}(X|Y)$ can be defined as

$$S^{(k)}(X|Y) = \frac{1}{N}\sum_{n=1}^{N}\frac{R_n^{(k)}(X)}{R_n^{(k)}(X|Y)}. \qquad (19.16)$$
Since $R_n^{(k)}(X|Y) \ge R_n^{(k)}(X)$ by construction,

$$0 < S^{(k)}(X|Y) \le 1. \qquad (19.17)$$
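A compact, self-contained sketch of Eq. (19.16) follows; the function name `s_measure` is ours, and the defaults mirror the EEG parameters quoted above ($m = 10$, $\tau = 5$, $T = 50$), with $k = 4$ chosen arbitrarily for illustration:

```python
import numpy as np

def s_measure(x, y, m=10, tau=5, k=4, theiler=50):
    """Sketch of Eq. (19.16): S^(k)(X|Y) = (1/N) sum_n R_n^(k)(X) / R_n^(k)(X|Y)."""
    def embed(v):
        v = np.asarray(v, float)
        start = (m - 1) * tau
        return np.column_stack([v[start - i * tau: len(v) - i * tau] for i in range(m)])
    X, Y = embed(x), embed(y)
    N = len(X)
    dX = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    dY = ((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    excl = np.abs(np.arange(N)[:, None] - np.arange(N)[None, :]) <= theiler
    r = np.argsort(np.where(excl, np.inf, dX), axis=1)[:, :k]  # neighbors of x_n
    s = np.argsort(np.where(excl, np.inf, dY), axis=1)[:, :k]  # equal-time partners
    R_X  = np.take_along_axis(dX, r, axis=1).mean(axis=1)      # Eq. (19.14)
    R_XY = np.take_along_axis(dX, s, axis=1).mean(axis=1)      # Eq. (19.15)
    return np.mean(R_X / R_XY)
```

When $Y$ is a copy of $X$ the equal-time partners coincide with the true nearest neighbors and $S^{(k)}(X|Y) = 1$; for independent signals the ratio drops toward zero, as Eq. (19.17) indicates.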
Low values of $S^{(k)}(X|Y)$ indicate independence between $X$ and $Y$, while high values indicate synchronization. Arnhold et al. [2] introduced another nonlinear interdependence measure $H^{(k)}(X|Y)$ as
$$H^{(k)}(X|Y) = \frac{1}{N}\sum_{n=1}^{N}\log\frac{R_n(X)}{R_n^{(k)}(X|Y)}. \qquad (19.18)$$
$H^{(k)}(X|Y)$ would be 0 if $X$ and $Y$ are completely independent, while it is positive if closeness in $Y$ also implies closeness in $X$ for equal time indexes.
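Equation (19.18) can be sketched in the same style; the name `h_measure` is ours, and we read $R_n(X)$ as the mean squared distance from $\mathbf{x}_n$ to all $N-1$ other delay vectors (the $k = N-1$ case), which is an assumption consistent with the definition of $R(X)$ above:

```python
import numpy as np

def h_measure(x, y, m=10, tau=5, k=4, theiler=50):
    """Sketch of Eq. (19.18): H^(k)(X|Y) = (1/N) sum_n log[R_n(X) / R_n^(k)(X|Y)].

    R_n(X) is taken as the mean squared distance from x_n to all other
    delay vectors; this is our reading of the numerator in Eq. (19.18).
    """
    def embed(v):
        v = np.asarray(v, float)
        start = (m - 1) * tau
        return np.column_stack([v[start - i * tau: len(v) - i * tau] for i in range(m)])
    X, Y = embed(x), embed(y)
    N = len(X)
    dX = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    dY = ((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    excl = np.abs(np.arange(N)[:, None] - np.arange(N)[None, :]) <= theiler
    s = np.argsort(np.where(excl, np.inf, dY), axis=1)[:, :k]  # equal-time partners
    R_n = dX.sum(axis=1) / (N - 1)       # mean distance to all N-1 partners
    R_XY = np.take_along_axis(dX, s, axis=1).mean(axis=1)      # Eq. (19.15)
    return np.mean(np.log(R_n / R_XY))
```

Unlike $S^{(k)}$, this quantity is a logarithmic ratio, so synchronized pairs give large positive values while independent pairs stay near zero.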