Table 2.2  Cost per iteration of the MCA learning laws

Learning Law      Flops per Iteration
OJA+              8n + 3
LUO               8n + 1
EXIN              8n + 1
OJAn              8n
FENG              8n - 1
OJA               6n
(flops¹³) and shown in Table 2.2. OJA has the lowest cost. All costs depend linearly on the dimensionality n of the data. For high-dimensional data, all learning laws have essentially the same cost except OJA, whose 6n flops per iteration represent a saving of about 2n flops (the other laws require roughly 33% more flops than OJA).
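As a rough check on the 6n figure for OJA, the following sketch (not from the original text) implements one iteration of an OJA-type MCA update in the anti-Hebbian form w ← w − α y (x − y w), with the approximate flop count of each step noted in the comments; the exact form of the rule and the synthetic data are assumptions made only for illustration.

```python
import numpy as np

def oja_mca_step(w, x, alpha):
    """One OJA-type MCA iteration, w <- w - alpha * y * (x - y * w).

    Approximate flop count for input dimension n, consistent with the
    ~6n entry for OJA in Table 2.2:
      y = w @ x               n mult + (n - 1) add   ~ 2n flops
      r = x - y * w           n mult + n sub         ~ 2n flops
      w - (alpha * y) * r     n mult + n sub (+ 1)   ~ 2n flops
    """
    y = w @ x                   # scalar projection of the input onto w
    r = x - y * w               # anti-Hebbian residual
    return w - (alpha * y) * r  # push w toward the minor component direction

# Hypothetical usage on a synthetic zero-mean input stream
rng = np.random.default_rng(0)
n = 4
w = rng.normal(size=n)
for _ in range(1000):
    x = rng.normal(size=n)
    w = oja_mca_step(w, x, alpha=0.01)
```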
2.8.2 Quantization Errors
Limited-precision (quantization) errors can degrade the solution of gradient-based algorithms with respect to the performance achievable in infinite precision. These errors accumulate over time without bound, leading in the long term (tens of millions of iterations) to an eventual overflow [23]. This type of divergence is here called numerical divergence. There are two sources of quantization errors:
1. The analog-to-digital conversion used to obtain the discrete time-series input. For a uniform quantization characteristic, this quantization error has zero mean.
2. The finite word length used to store all internal algorithmic quantities. This error does not have zero mean; its mean is the result of multiplication schemes that either truncate or round products to fit the given fixed word length (the bias is illustrated in the sketch after this list).
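As a rough illustration of the second source (not part of the original text), the sketch below quantizes exact products onto a hypothetical fixed-point grid using either rounding to nearest or truncation; the measured error mean shows that the bias is most pronounced for truncation. The word length and the data are arbitrary assumptions.

```python
import numpy as np

def quantize(v, frac_bits, mode):
    """Map values onto a fixed-point grid with spacing 2**-frac_bits."""
    step = 2.0 ** -frac_bits
    if mode == "round":
        return np.round(v / step) * step   # round to nearest grid point
    return np.floor(v / step) * step       # truncate toward minus infinity

rng = np.random.default_rng(0)
a = rng.uniform(-1.0, 1.0, size=100_000)
b = rng.uniform(-1.0, 1.0, size=100_000)
exact = a * b                              # products in double precision

for mode in ("round", "trunc"):
    err = quantize(exact, frac_bits=8, mode=mode) - exact
    # rounding is close to zero mean; truncation is biased by about -2**-9
    print(f"{mode:5s}  mean error = {err.mean():+.6f}")
```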
The degradation of the solution is proportional to the conditioning of the
input (i.e., to the spread of the eigenvalue spectrum of the input autocorrelation
matrix). Hence, this problem is important for near-singular matrices (e.g., in the
application of MCA for the computation of the translation in computer vision)
[24]. For an example with FENG, see Figure 2.14, which shows the computation
of the smallest eigenvalue of a singular matrix whose eigenvalues are 0, 1, and
1.5. Notice the finite-time divergence caused here by numerical problems. The
following remarks, demonstrated for the OLS algorithm in [23], are also valid
for the MCA learning laws.
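The dependence on conditioning can be checked directly on the eigenvalue spread of the input autocorrelation matrix. The sketch below builds a singular 3 x 3 matrix with the eigenvalues quoted above (0, 1, and 1.5) from an arbitrary orthogonal basis (an illustrative construction, not the matrix used for Figure 2.14) and reports its spectrum and condition number.

```python
import numpy as np

# Illustrative autocorrelation-like matrix with eigenvalues 0, 1, and 1.5;
# the orthogonal basis Q is arbitrary and serves only the example.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
R = Q @ np.diag([0.0, 1.0, 1.5]) @ Q.T

print("eigenvalues:", np.linalg.eigvalsh(R))    # ~ [0, 1, 1.5]
print("condition number:", np.linalg.cond(R))   # huge: R is (numerically) singular
```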
Remark 76 (Slow Learning Rate) Decreasing the learning rate improves the performance of the infinite-precision algorithm. Nevertheless, this decrease increases the deviation of the finite-precision algorithm from the infinite-precision performance, because the smaller correction terms are more easily swamped by, or fall below, the quantization step.
¹³ For a definition of flop, see [75, p. 19].