have additional structure, such as time structure (e.g. speech signals) or
higher-dimensional dependencies (e.g. images).
In the next section we will define what it means to have this additional time structure and how to build algorithms that explicitly exploit it. In other words, the sample order of our signals now becomes relevant.
Stochastic processes
Definition 4.4 (Stochastic process): A sequence of random vectors $x(t)$, $t = 1, 2, \ldots$ is called a discrete stochastic process. The process $(x(t))_t$ is said to be i.i.d. if the $x(t)$ are identically distributed and independent. A realization or path of $(x(t))_t$ is given by the $\mathbb{R}^n$-sequence
$$x(1)(\omega),\; x(2)(\omega), \ldots$$
for any $\omega \in \Omega$.
The expectation of the process is simply the sequence of the expectations of the random vectors, and similarly for the covariance of the process, in particular for the variance:
$$\mathrm{E}\big((x(t))_t\big) := \big(\mathrm{E}(x(t))\big)_t$$
$$\mathrm{Cov}\big((x(t))_t\big) := \big(\mathrm{Cov}(x(t))\big)_t$$
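As an illustration of these per-time-index statistics, the following sketch (the toy first-order autoregressive process, the dimension, and all variable names are my own choices for demonstration, not taken from the text) draws many realizations of a small vector process and estimates the expectation and covariance sequences by averaging over realizations, i.e. over $\omega$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, T, n_paths = 2, 50, 10_000          # dimension, time steps, number of realizations

# Toy process (illustrative only): x(t) = 0.8 * x(t-1) + white Gaussian noise
x = np.zeros((n_paths, T, n))
for t in range(1, T):
    x[:, t] = 0.8 * x[:, t - 1] + rng.standard_normal((n_paths, n))

# E((x(t))_t): sequence of per-time means, estimated by averaging over realizations
mean_seq = x.mean(axis=0)                                  # shape (T, n)

# Cov((x(t))_t): sequence of per-time covariance matrices
centered = x - mean_seq
cov_seq = np.einsum('pti,ptj->tij', centered, centered) / (n_paths - 1)   # shape (T, n, n)

print(mean_seq[-1])   # close to 0 for this zero-mean example
print(cov_seq[-1])    # close to (1 / (1 - 0.8**2)) * I for large t
```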
So far we have not yet used the time structure. We now introduce a notion that makes sense only if this additional structure is present.
Given $\tau \in \mathbb{N}$, for $t > \tau$ we define the autocovariance of $(x(t))_t$ to be the sequence of matrices
$$C_\tau := \big(\mathrm{Cov}(x(t), x(t-\tau))\big)_t$$
and the autocorrelation to be
$$R_\tau := \big(\mathrm{Cor}(x(t), x(t-\tau))\big)_t.$$
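The autocovariance and autocorrelation can be estimated in the same spirit, averaging over realizations for each fixed $t$ so that the estimator matches the ensemble definitions above (again a hypothetical toy process, not an example from the text):

```python
import numpy as np

rng = np.random.default_rng(1)
n, T, n_paths, tau = 2, 50, 10_000, 3

# Same toy process as before: x(t) = 0.8 * x(t-1) + white Gaussian noise
x = np.zeros((n_paths, T, n))
for t in range(1, T):
    x[:, t] = 0.8 * x[:, t - 1] + rng.standard_normal((n_paths, n))

centered = x - x.mean(axis=0)            # remove the per-time mean
std_t = centered.std(axis=0, ddof=1)     # per-time standard deviations, shape (T, n)

# C_tau: for each t > tau, the cross-covariance matrix Cov(x(t), x(t - tau))
C_tau = np.einsum('pti,ptj->tij',
                  centered[:, tau:], centered[:, :-tau]) / (n_paths - 1)   # shape (T - tau, n, n)

# R_tau: normalize entrywise by the standard deviations of x(t) and x(t - tau)
R_tau = C_tau / (std_t[tau:, :, None] * std_t[:-tau, None, :])

print(C_tau[-1])
print(R_tau[-1])   # diagonal close to 0.8**tau for this AR(1)-type example
```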
Consider what we now call the instantaneous mixing model
$$x(t) := A s(t)$$
for $n$-dimensional stochastic processes $s$ and $x$, and mixing matrix $A \in \mathrm{Gl}(n)$. Now we do not need $s(t)$ to be independent for every $t$,