ϕ(t) = [u(t−1)h(t−1), u(t−2)h(t−2), u(t−3)h(t−3), ···, u(t−n)h(t−n), u(t−1), u(t−2), u(t−3), ···, u(t−n), −y(t−1), −y(t−2), ···, −y(t−n)]^T ∈ R^{3n}. (6)
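As a concrete reading of (6), the information vector can be assembled from the input, switching, and output histories; the following is a minimal sketch (the function name and the most-recent-first ordering of the history buffers are assumptions, not from the source):

```python
import numpy as np

def build_phi(u_hist, h_hist, y_hist, n):
    """Assemble phi(t) as in (6).

    u_hist[i], h_hist[i], y_hist[i] hold u(t-1-i), h(t-1-i), y(t-1-i)
    for i = 0..n-1 (most recent sample first).
    Returns a vector of dimension 3n.
    """
    uh = [u_hist[i] * h_hist[i] for i in range(n)]   # u(t-i) h(t-i) block
    uu = [u_hist[i] for i in range(n)]               # u(t-i) block
    yy = [-y_hist[i] for i in range(n)]              # -y(t-i) block
    return np.array(uh + uu + yy)
```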
The following SG algorithm is used to estimate the parameter vector θ in (5):
θ̂(t) = θ̂(t−1) + (ϕ(t)/r(t)) [y(t) − ϕ^T(t)θ̂(t−1)], (7)
ϕ(t) = [u(t−1)h(t−1), u(t−2)h(t−2), u(t−3)h(t−3), ···, u(t−n)h(t−n), u(t−1), u(t−2), u(t−3), ···, u(t−n), −y(t−1), −y(t−2), ···, −y(t−n)]^T, (8)
r(t) = r(t−1) + ‖ϕ(t)‖², r(0) = 1, (9)
where r(t) is the step-size and the norm of the matrix X is defined by ‖X‖² := tr[XX^T].
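The update (7)–(9) can be sketched in a few lines; this is a minimal illustration assuming ϕ(t) and y(t) are already available as NumPy values (the function name sg_step is not from the source):

```python
import numpy as np

def sg_step(theta, r_prev, phi, y):
    """One SG update per (7)-(9).

    theta  : current estimate theta_hat(t-1), shape (d,)
    r_prev : previous denominator r(t-1)
    phi    : information vector phi(t), shape (d,)
    y      : scalar output y(t)
    Returns the updated (theta_hat(t), r(t)).
    """
    r = r_prev + np.dot(phi, phi)                   # (9): r(t) = r(t-1) + ||phi(t)||^2
    theta = theta + (phi / r) * (y - phi @ theta)   # (7): gradient correction
    return theta, r
```

Because r(t) grows without bound, the correction gain ϕ(t)/r(t) shrinks over time, which is exactly the slow-tracking behavior the forgetting factor below is meant to mitigate.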
The convergence of the SG algorithm is relatively slow compared with the recursive least squares algorithm. In order to improve the tracking performance of the SG algorithm, we can introduce a forgetting factor λ in the SG algorithm to obtain the SG algorithm with a forgetting factor (the FF-SG algorithm for short) as follows:
θ̂(t) = θ̂(t−1) + (ϕ(t)/r(t)) [y(t) − ϕ^T(t)θ̂(t−1)], (10)
ϕ(t) = [u(t−1)h(t−1), u(t−2)h(t−2), u(t−3)h(t−3), ···, u(t−n)h(t−n), u(t−1), u(t−2), u(t−3), ···, u(t−n), −y(t−1), −y(t−2), ···, −y(t−n)]^T, (11)
r(t) = λr(t−1) + ‖ϕ(t)‖², 0 < λ < 1, r(0) = 1. (12)
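The only change from the SG step is the discounting of r(t−1) in (12), which keeps the gain from vanishing; a minimal sketch (the function name and the default λ value are assumptions):

```python
import numpy as np

def ffsg_step(theta, r_prev, phi, y, lam=0.95):
    """One FF-SG update per (10)-(12); lam is the forgetting factor, 0 < lam < 1."""
    r = lam * r_prev + np.dot(phi, phi)             # (12): discounted denominator
    theta = theta + (phi / r) * (y - phi @ theta)   # (10): same correction as (7)
    return theta, r
```

With 0 < λ < 1, r(t) settles near ‖ϕ‖²/(1 − λ) instead of growing like t, so the algorithm keeps tracking parameter changes at the cost of a larger steady-state variance under noise.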
4 Example
Consider the following linear dynamic block,
[1 − 0.1q^{−1}]y(t) = [q^{−1} + 1.2q^{−2}]f(u(t)) + v(t),
the input {u(t)} is taken as a persistent excitation signal sequence with zero mean and unit variance, and {v(t)} is taken as a white noise sequence with zero mean and variance σ² = 0.10². The piecewise linearity is shown in Figure 1, with parameters m1 = 1, m2 = 0.8. Then we have
θ = [0.5(m1 − m2), 0.5b2(m1 − m2), 0.5(m1 + m2), 0.5b2(m1 + m2), a1]^T.
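A short self-contained simulation can confirm that the example system satisfies the regression form y(t) = ϕ^T(t)θ with this parameter vector. The segmentation f(u) = m1·u for u ≥ 0 and m2·u for u < 0, with h(t) = sign(u(t)), is an assumed reading of the piecewise linearity in Figure 1; the noise-free case is used for clarity:

```python
import numpy as np

m1, m2, b2, a1 = 1.0, 0.8, 1.2, -0.1  # example parameters from the text

def f(u):
    # Assumed piecewise linearity: slope m1 for u >= 0, slope m2 for u < 0,
    # i.e. f(u) = 0.5*(m1 + m2)*u + 0.5*(m1 - m2)*u*h with h = sign(u).
    return np.where(u >= 0, m1 * u, m2 * u)

rng = np.random.default_rng(1)
u = rng.standard_normal(200)
h = np.sign(u)
y = np.zeros(200)
for t in range(2, 200):
    # [1 + a1 q^-1] y(t) = [q^-1 + b2 q^-2] f(u(t)), noise-free
    y[t] = -a1 * y[t - 1] + f(u[t - 1]) + b2 * f(u[t - 2])

theta = np.array([0.5 * (m1 - m2), 0.5 * b2 * (m1 - m2),
                  0.5 * (m1 + m2), 0.5 * b2 * (m1 + m2), a1])
# Check y(t) = phi(t)^T theta at an arbitrary time instant
t = 100
phi = np.array([u[t-1] * h[t-1], u[t-2] * h[t-2], u[t-1], u[t-2], -y[t-1]])
assert np.isclose(y[t], phi @ theta)
```

Under these values the true parameter vector evaluates to θ ≈ [0.1, 0.12, 0.9, 1.08, −0.1]^T, which is what the SG or FF-SG estimates should approach.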