// k (the window size), t, x (the sliding window of past samples), y, and twist
// (a java.util.Random) are set up earlier in the example; the regression class
// is org.apache.commons.math3.stat.regression.MillerUpdatingRegression.
MillerUpdatingRegression rm = new MillerUpdatingRegression(k, true);
for (int i = 0; i < 1000; i++) {
    // Observe a noisy sine wave: the true signal plus Gaussian noise
    double base = Math.sin(t);
    y = base + 0.5 * twist.nextGaussian();
    // Update the input time series
    if (i >= x.length) {
        rm.addObservation(x, y);
        if (rm.getN() > 2 * x.length) {
            double[] B = rm.regress().getParameterEstimates();
            // Predicted value for this sample from the preceding window;
            // these predictions form the solid line in Figure 11.6.
            double y2 = B[0];
            for (int j = 0; j < x.length; j++)
                if (!Double.isNaN(B[j + 1])) y2 += B[j + 1] * x[j];
            // Slide the window forward and append the newest observation
            for (int j = 0; j < x.length - 1; j++) x[j] = x[j + 1];
            x[x.length - 1] = y;
        }
    } else {
        x[i] = y;  // still filling the initial window
    }
    t += 2.0 * Math.PI / 60.0;
}
This example uses the Miller method, a technique that updates the QR factorization of the regression as each new observation arrives, so the fit can be refreshed without reprocessing the whole history. It would also be possible to use the gradient descent technique discussed earlier, especially if you're using logistic regression instead of linear regression. However, this version works passably well, as shown in Figure 11.6. That said, neural networks often perform better at this task, as shown in the next section. In Figure 11.6 the original signal is drawn as a dotted line, the observed data as gray dots, and the predicted values as a solid line.
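As a point of comparison, here is a minimal sketch of the gradient descent alternative mentioned above, applied to the same sliding window. It is not part of the original example: the helper name sgdStep, the weight array w (with the intercept stored in w[0]), and the learning rate eta are illustrative choices. Each call performs one stochastic gradient step for squared-error linear regression.

// A minimal sketch of an online (stochastic) gradient descent update for
// linear regression with squared error. sgdStep, w, and eta are illustrative
// names introduced here; they are not part of the original example.
static double sgdStep(double[] w, double[] x, double y, double eta) {
    // Predict the current sample from the window; w[0] holds the intercept.
    double y2 = w[0];
    for (int j = 0; j < x.length; j++) y2 += w[j + 1] * x[j];
    // Gradient of 0.5 * (y2 - y)^2 is (y2 - y) for the intercept
    // and (y2 - y) * x[j] for each window weight.
    double err = y2 - y;
    w[0] -= eta * err;
    for (int j = 0; j < x.length; j++) w[j + 1] -= eta * err * x[j];
    return y2;  // the prediction made before this update
}

Inside the main loop, a call such as double y2 = sgdStep(w, x, y, 0.01); (with double[] w = new double[x.length + 1] declared beforehand) would take the place of rm.addObservation() and rm.regress(), trading the exact least-squares update for a cheaper approximate one.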
 