Combining this with Eq. (7.23), we finally obtain
\hat{\xi} = \frac{\operatorname{Im}\left(\hat{\sigma}_T \hat{\sigma}_S^{*}\right)}
                 {\sqrt{\,|\hat{\sigma}_T|^{2}\,|\hat{\sigma}_S|^{2}
                        - \left[\operatorname{Re}\left(\hat{\sigma}_T \hat{\sigma}_S^{*}\right)\right]^{2}}}
          = \frac{(1 - d_1 d_2)\operatorname{Im}\left(\sigma_T \sigma_S^{*}\right)}
                 {\sqrt{\,(1 - d_1 d_2)^{2}\left(|\sigma_T|^{2}\,|\sigma_S|^{2}
                        - \left[\operatorname{Re}\left(\sigma_T \sigma_S^{*}\right)\right]^{2}\right)}}
          = \frac{\operatorname{Im}\left(\sigma_T \sigma_S^{*}\right)}
                 {\sqrt{\,|\sigma_T|^{2}\,|\sigma_S|^{2}
                        - \left[\operatorname{Re}\left(\sigma_T \sigma_S^{*}\right)\right]^{2}}}
          = \xi .      (7.32)
The above equation indicates that the corrected imaginary coherence computed using
voxel spectra is exactly equal to the true corrected imaginary coherence. This implies
that the corrected imaginary coherence is unaffected by the algorithm leakage.
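This invariance is easy to check numerically. The sketch below is a minimal illustration (not code from the book; the simulation setup, the leakage coefficients, and names such as corrected_imcoh are assumptions made here): it mixes simulated seed and target spectra with a real-valued, instantaneous leakage matrix and compares the corrected imaginary coherence computed from the leaky spectra with the true value.

```python
# Minimal numerical sketch of the invariance stated in Eq. (7.32).
# All names and parameter values are illustrative assumptions, not from the book.
import numpy as np

rng = np.random.default_rng(0)
n_trials = 20000

# True seed/target spectra at one frequency: circular complex Gaussian signals
# with a lagged (partly imaginary) interaction between them.
common = rng.standard_normal(n_trials) + 1j * rng.standard_normal(n_trials)
sigma_S = common + 0.5 * (rng.standard_normal(n_trials) + 1j * rng.standard_normal(n_trials))
sigma_T = 0.8j * common + (rng.standard_normal(n_trials) + 1j * rng.standard_normal(n_trials))

def corrected_imcoh(x, y):
    """Im(coherence) / sqrt(1 - Re(coherence)^2), estimated over trials."""
    s_xy = np.mean(x * np.conj(y))
    s_xx = np.mean(np.abs(x) ** 2)
    s_yy = np.mean(np.abs(y) ** 2)
    coh = s_xy / np.sqrt(s_xx * s_yy)
    return coh.imag / np.sqrt(1.0 - coh.real ** 2)

# Instantaneous algorithm leakage: estimated spectra are real-valued mixtures
# of the true spectra (here with leakage coefficients d1 = 0.3 and d2 = 0.2).
sigma_T_hat = sigma_T + 0.3 * sigma_S
sigma_S_hat = sigma_S + 0.2 * sigma_T

print(corrected_imcoh(sigma_T, sigma_S))          # true corrected imaginary coherence
print(corrected_imcoh(sigma_T_hat, sigma_S_hat))  # identical despite the leakage
```

Because the real-valued mixing rescales the numerator and the denominator of the corrected imaginary coherence by the same factor 1 − d1 d2 (positive here), the two printed values agree to machine precision.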
In the above arguments, the corrected imaginary coherence is introduced in a somewhat ad hoc manner. In the following, we derive the corrected imaginary coherence in two different ways: by factorization of the mutual information and by regression of the target signal on the seed signal. These derivations provide some insight into the nature of the corrected imaginary coherence.
7.5.2 Factorization of Mutual Information
Here we show that the corrected imaginary coherence is derived from the factorization of the mutual information into the instantaneous and non-instantaneous components.
We assume that σ_T and σ_S are Gaussian-distributed, circular complex random variables. A concise explanation of the complex Gaussian random variable can be found in Sect. C.2 in the Appendix.
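As a concrete illustration of this assumption, a circular complex Gaussian variable can be generated by drawing independent, zero-mean real and imaginary parts with equal variances; circularity then shows up as a vanishing pseudo-variance E[z²]. The following sketch (an illustration under these assumptions, not code from the book) checks these properties numerically.

```python
# Minimal sketch of a circular complex Gaussian random variable.
# The variance value and variable names are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
var = 2.0

# Independent real and imaginary parts, each with variance var/2 -> circular.
z = np.sqrt(var / 2) * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

print(np.mean(z))               # ~0   : zero mean
print(np.mean(z * z))           # ~0   : pseudo-variance vanishes (circularity)
print(np.mean(np.abs(z) ** 2))  # ~var : ordinary variance
```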
To derive the mutual information in the frequency domain, we define the entropy of the voxel spectra σ_T and σ_S. To do so, we first define the real-valued 2 × 1 vectors η_T = [Re(σ_T), Im(σ_T)]^T and η_S = [Re(σ_S), Im(σ_S)]^T. Using these vectors, the entropy is defined for σ_T and σ_S such that
H(\sigma_T) = -\int p(\eta_T) \log p(\eta_T)\, d\eta_T ,      (7.33)

H(\sigma_S) = -\int p(\eta_S) \log p(\eta_S)\, d\eta_S .      (7.34)
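For a circular complex Gaussian variable with variance σ², these integrals have the closed form H = log(π e σ²). The sketch below (an illustration under that assumption, not code from the book) compares a Monte Carlo estimate of Eq. (7.33) with this closed form.

```python
# Minimal sketch of Eq. (7.33) for a circular complex Gaussian voxel spectrum.
# Variable names and the variance value are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
n = 500_000
var = 3.0

# Circular complex Gaussian sigma_T; eta_T = [Re(sigma_T), Im(sigma_T)].
sigma_T = np.sqrt(var / 2) * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
eta_T = np.column_stack([sigma_T.real, sigma_T.imag])

# log p(eta_T) for a 2-D Gaussian with covariance (var/2) * I.
log_p = -np.sum(eta_T ** 2, axis=1) / var - np.log(np.pi * var)

print(-np.mean(log_p))             # Monte Carlo estimate of H(sigma_T)
print(np.log(np.pi * np.e * var))  # closed form: log(pi * e * var)
```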
The entropy is a metric for uncertainty. H(σ_T) represents the uncertainty when σ_T is unknown, and H(σ_S) represents the uncertainty when σ_S is unknown. The joint entropy is defined as
 