all streams (plus noise), i.e.,
$$x_{t,i} = \phi_{1,1} x_{t-1,1} + \cdots + \phi_{1,W} x_{t-W,1} + \cdots + \phi_{n,1} x_{t-1,n} + \cdots + \phi_{n,W} x_{t-W,n} + \epsilon_t. \qquad (5.2)$$
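For concreteness, here is a minimal sketch (assuming NumPy; the helper name `multivariate_ar_row` and the `history` array are hypothetical) of how the regressor row implied by Eq. (5.2) can be assembled from the last $W$ values of all $n$ streams, so that estimating the coefficients $\phi_{j,l}$ reduces to the same least-squares setting discussed below.

```python
import numpy as np

def multivariate_ar_row(history, t, W):
    """Regressor row for Eq. (5.2): the W most recent values of every stream,
    ordered as [x_{t-1,1}, ..., x_{t-W,1}, ..., x_{t-1,n}, ..., x_{t-W,n}].
    history has shape (T, n), one column per stream; returns a vector of length n*W."""
    lags = history[t - W : t][::-1]   # rows x_{t-1}, x_{t-2}, ..., x_{t-W}
    return lags.T.reshape(-1)         # flatten stream by stream

# Toy example: the row used to predict x_{t,i} (for any target stream i).
history = np.arange(20.0).reshape(10, 2)   # 10 time ticks, n = 2 streams
row = multivariate_ar_row(history, t=10, W=3)
```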
4.2 Recursive Least Squares (RLS)
Recursive Least Squares (RLS) is a method that allows dynamic update of a least-squares fit. The least-squares solution to an overdetermined system of equations $Xb = y$, where $X \in \mathbb{R}^{m \times k}$ (measurements), $y \in \mathbb{R}^m$ (output variables) and $b \in \mathbb{R}^k$ (regression coefficients to be estimated), is given by the solution of $X^T X b = X^T y$. Thus, all we need for the solution are the projections
$$P \equiv X^T X \quad\text{and}\quad q \equiv X^T y.$$
We need only space $O(k^2 + k) = O(k^2)$ to keep the model up to date. When a new row $x_{m+1} \in \mathbb{R}^k$ and output $y_{m+1}$ arrive, we can update
$$P \leftarrow P + x_{m+1} x_{m+1}^T \quad\text{and}\quad q \leftarrow q + y_{m+1} x_{m+1}.$$
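As a concrete illustration, the following is a minimal sketch (assuming NumPy; the function names are hypothetical) of maintaining the projections $P$ and $q$ under these rank-one updates and recovering $b$ from $P b = q$ on demand, keeping only $O(k^2 + k)$ state no matter how many rows have streamed by.

```python
import numpy as np

def update_projections(P, q, x_new, y_new):
    """Fold one new row x_new (length k) and output y_new into the
    running projections P = X^T X and q = X^T y."""
    P = P + np.outer(x_new, x_new)   # P <- P + x_{m+1} x_{m+1}^T
    q = q + y_new * x_new            # q <- q + y_{m+1} x_{m+1}
    return P, q

def current_coefficients(P, q):
    """Recover the least-squares coefficients b by solving P b = q."""
    return np.linalg.solve(P, q)

# Usage: stream in 1000 (row, output) pairs generated from known coefficients.
k = 3
P, q = np.zeros((k, k)), np.zeros(k)
b_true = np.array([0.5, -0.2, 0.1])
rng = np.random.default_rng(0)
for _ in range(1000):
    x = rng.normal(size=k)
    y = x @ b_true + 0.01 * rng.normal()
    P, q = update_projections(P, q, x, y)
b_hat = current_coefficients(P, q)   # close to b_true
```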
In fact, it is possible to update the regression coefficient vector $b$ without explicitly inverting $P$ to solve $b = P^{-1} q$. In particular (see, e.g., [60]) the update equations are
$$G \leftarrow G - (1 + x_{m+1}^T G x_{m+1})^{-1} G x_{m+1} x_{m+1}^T G, \qquad (5.3)$$
$$b \leftarrow b - G x_{m+1} (x_{m+1}^T b - y_{m+1}), \qquad (5.4)$$
where the matrix $G$ can be initialized to $G \leftarrow \epsilon I$, with $\epsilon$ a small positive number and $I$ the $k \times k$ identity matrix.
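Below is a minimal sketch of the inversion-free updates in Eqs. (5.3)-(5.4), again assuming NumPy; the function name `rls_update` and the synthetic data are illustrative choices, not part of the original text.

```python
import numpy as np

def rls_update(G, b, x_new, y_new):
    """One RLS step (Eqs. 5.3-5.4): fold the new row x_new and output y_new
    into G (a running approximation of (X^T X)^{-1}) and the coefficient
    vector b, with no explicit matrix inversion."""
    Gx = G @ x_new
    G = G - np.outer(Gx, Gx) / (1.0 + x_new @ Gx)    # Eq. (5.3)
    b = b - G @ x_new * (x_new @ b - y_new)          # Eq. (5.4)
    return G, b

# Usage sketch: recover known coefficients from streaming (x, y) pairs.
k = 3
eps = 0.1
G = eps * np.eye(k)            # G <- eps * I, eps a small positive number
b = np.zeros(k)
b_true = np.array([0.5, -0.2, 0.1])
rng = np.random.default_rng(1)
for _ in range(5000):
    x = rng.normal(size=k)
    y = x @ b_true + 0.01 * rng.normal()
    G, b = rls_update(G, b, x, y)
# b is now close to b_true
```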
RLS and AR. In the context of auto-regressive modeling (Eq. 5.1), we have one equation for each stream value $x_{w+1}, \ldots, x_t, \ldots$, i.e., the $m$-th row of the $X$ matrix above is
$$X_m = [x_{m-1}\ x_{m-2}\ \cdots\ x_{m-w}]^T \in \mathbb{R}^w$$
and $y_m = x_m$, for $m = w+1, \ldots, t$ (with $t > w$). In this case, the solution vector $b$ consists precisely of the auto-regression coefficients in Eq. 5.1, i.e.,
$$b = [\phi_1\ \phi_2\ \cdots\ \phi_w]^T \in \mathbb{R}^w.$$
RLS can be similarly used for multivariate AR model estimation.
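Putting the pieces together, here is a short sketch (assuming NumPy; names and toy data are hypothetical) of estimating the AR($w$) coefficients of a single stream with the RLS updates above, where each row consists of the previous $w$ values and the output is the current value.

```python
import numpy as np

def ar_coefficients_rls(stream, w, eps=0.1):
    """Estimate the AR(w) coefficients b = [phi_1 ... phi_w]^T online with RLS:
    each value x_m is regressed on the previous w values [x_{m-1}, ..., x_{m-w}]."""
    G = eps * np.eye(w)                   # G <- eps * I
    b = np.zeros(w)
    for m in range(w, len(stream)):
        x_row = stream[m - w : m][::-1]   # X_m = [x_{m-1}, ..., x_{m-w}]
        y_out = stream[m]                 # output y_m = x_m
        Gx = G @ x_row
        G = G - np.outer(Gx, Gx) / (1.0 + x_row @ Gx)   # Eq. (5.3)
        b = b - G @ x_row * (x_row @ b - y_out)         # Eq. (5.4)
    return b

# Usage sketch: a synthetic AR(2) stream with known coefficients.
rng = np.random.default_rng(2)
phi_true = np.array([0.6, -0.3])
x = np.zeros(5000)
for t in range(2, len(x)):
    x[t] = phi_true @ x[t - 2 : t][::-1] + rng.normal()
phi_hat = ar_coefficients_rls(x, w=2)    # close to phi_true
```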