$$
H(\sigma_T, \sigma_S) = -\iint p(\sigma_T, \sigma_S)\,\log p(\sigma_T, \sigma_S)\; d\sigma_T\, d\sigma_S. \tag{7.35}
$$
The mutual information between $\sigma_T$ and $\sigma_S$ is then defined as
$$
I(\sigma_T, \sigma_S) = H(\sigma_T) + H(\sigma_S) - H(\sigma_T, \sigma_S). \tag{7.36}
$$
When $\sigma_T$ and $\sigma_S$ are independent, we have $I(\sigma_T, \sigma_S) = 0$.
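This follows from (7.35) and (7.36): under independence the joint density factorizes as $p(\sigma_T, \sigma_S) = p(\sigma_T)\,p(\sigma_S)$, so the joint entropy splits,
$$
H(\sigma_T, \sigma_S) = -\iint p(\sigma_T)\,p(\sigma_S)\left[\log p(\sigma_T) + \log p(\sigma_S)\right] d\sigma_T\, d\sigma_S = H(\sigma_T) + H(\sigma_S),
$$
and substituting into (7.36) gives $I(\sigma_T, \sigma_S) = 0$.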
Under the assumption that $\sigma_T$ and $\sigma_S$ are complex Gaussian distributed, the entropy is expressed (omitting additive constants, which cancel in the mutual information) as
$$
H(\sigma_T) = \log \langle |\sigma_T|^2 \rangle, \tag{7.37}
$$
$$
H(\sigma_S) = \log \langle |\sigma_S|^2 \rangle. \tag{7.38}
$$
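As a quick numerical sketch (illustrative code, not from the original text), the entropy term in (7.37) can be estimated from samples of a complex Gaussian by replacing the expectation $\langle\cdot\rangle$ with a sample mean:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Circular complex Gaussian samples with total power 2.0
# (real and imaginary parts each carry half the power).
power = 2.0
sigma_t = rng.normal(scale=np.sqrt(power / 2), size=n) \
    + 1j * rng.normal(scale=np.sqrt(power / 2), size=n)

# Entropy term of Eq. (7.37), additive constants omitted:
h_t = np.log(np.mean(np.abs(sigma_t) ** 2))
print(h_t, np.log(power))  # both close to log(2.0) ~= 0.693
```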
The joint entropy is given by
$$
H(\sigma_T, \sigma_S) = \log \begin{vmatrix} \langle |\sigma_T|^2 \rangle & \langle \sigma_T \sigma_S^* \rangle \\ \langle \sigma_S \sigma_T^* \rangle & \langle |\sigma_S|^2 \rangle \end{vmatrix} = \log \left( \langle |\sigma_T|^2 \rangle \langle |\sigma_S|^2 \rangle - \left| \langle \sigma_T \sigma_S^* \rangle \right|^2 \right). \tag{7.39}
$$
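A minimal sketch of (7.39) in code (a hypothetical helper; sample averages stand in for the expectations $\langle\cdot\rangle$):

```python
import numpy as np

def joint_entropy(sig_t: np.ndarray, sig_s: np.ndarray) -> float:
    """Joint entropy of two complex Gaussian signals per Eq. (7.39),
    additive constants omitted; expectations replaced by sample means."""
    c_tt = np.mean(np.abs(sig_t) ** 2)      # <|sigma_T|^2>
    c_ss = np.mean(np.abs(sig_s) ** 2)      # <|sigma_S|^2>
    c_ts = np.mean(sig_t * np.conj(sig_s))  # <sigma_T sigma_S*>
    # log of the determinant of the 2x2 covariance matrix
    return float(np.log(c_tt * c_ss - np.abs(c_ts) ** 2))
```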
Therefore, the mutual information between $\sigma_T$ and $\sigma_S$ is obtained as
$$
I(\sigma_T, \sigma_S) = H(\sigma_T) + H(\sigma_S) - H(\sigma_T, \sigma_S) = \log \frac{\langle |\sigma_T|^2 \rangle \langle |\sigma_S|^2 \rangle}{\langle |\sigma_T|^2 \rangle \langle |\sigma_S|^2 \rangle - \left| \langle \sigma_T \sigma_S^* \rangle \right|^2}. \tag{7.40}
$$
It is easy to see that the mutual information $I(\sigma_T, \sigma_S)$ is related to the magnitude coherence $|\gamma|$ such that
$$
I(\sigma_T, \sigma_S) = -\log \left( 1 - |\gamma|^2 \right), \tag{7.41}
$$
where
$$
|\gamma|^2 = \frac{\left| \langle \sigma_T \sigma_S^* \rangle \right|^2}{\langle |\sigma_T|^2 \rangle \langle |\sigma_S|^2 \rangle}. \tag{7.42}
$$
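Putting (7.40)-(7.42) together, a small simulation (illustrative setup, not from the text) checks that the direct expression and the coherence form of the mutual information agree:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Two correlated complex Gaussian signals sharing a common component.
common = rng.normal(size=n) + 1j * rng.normal(size=n)
sig_t = common + rng.normal(size=n) + 1j * rng.normal(size=n)
sig_s = common + rng.normal(size=n) + 1j * rng.normal(size=n)

c_tt = np.mean(np.abs(sig_t) ** 2)
c_ss = np.mean(np.abs(sig_s) ** 2)
c_ts = np.mean(sig_t * np.conj(sig_s))

# Mutual information directly from Eq. (7.40)
i_direct = np.log(c_tt * c_ss / (c_tt * c_ss - np.abs(c_ts) ** 2))

# Mutual information via the magnitude coherence, Eqs. (7.41)-(7.42)
coh2 = np.abs(c_ts) ** 2 / (c_tt * c_ss)
i_coherence = -np.log(1.0 - coh2)

print(i_direct, i_coherence)  # equal up to floating-point rounding
```

The two lines compute the same quantity, since dividing the numerator and denominator of (7.40) by $\langle |\sigma_T|^2 \rangle \langle |\sigma_S|^2 \rangle$ yields (7.41).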
Let us consider the factorization of the mutual information using the real and
imaginary parts of coherence. Using
$|\gamma|^2 = [\Re(\gamma)]^2 + [\Im(\gamma)]^2$, we have