TABLE 7.1: Examples of Data Obtained from SI Modules, in the Form of Errors for Each Feature

                        Pattern Class
Feature     1        2        3        4        5
1           5.26     4.25     1.78     0.85     3.99
2           21.03    3.25     9.36     10.05    2.01
represents linear search complexity. The following pseudocode outlines the
coordinator's voting function:
Algorithm 4 Coordinator Voting Scheme
1: MinFeature = 99.99
2: for i = 1 to MaxPatternClass do
3:    for j = 1 to MaxFeatureNo do
4:       if i.j.FeatErr ≤ MinFeature then
5:          MinFeature = i.j.FeatErr
6:       end if
7:    end for
8: end for
This algorithm finds the minimum error obtained from the recognition process across all features. As with the vote-counting function in the SI module nodes, we can derive a Big-O notation for the coordinator's voting function, f(v_{min}), as an n-degree polynomial in the number of executable instructions, with n = 2. Therefore, the following Big-O notation applies:
f(v_{min}) = O(n^2)    (7.4)
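As a concrete illustration, the following Python sketch implements the minimum-error selection of Algorithm 4. The function name coordinator_vote and the nested-list layout of the feature errors are assumptions made for this example, not part of the original scheme.

def coordinator_vote(feat_err):
    """Return the minimum recognition error over all pattern classes
    and features, following Algorithm 4."""
    min_feature = 99.99  # sentinel initial value, as in Algorithm 4
    for class_errors in feat_err:      # i = 1 .. MaxPatternClass
        for err in class_errors:       # j = 1 .. MaxFeatureNo
            if err <= min_feature:
                min_feature = err
    return min_feature

# Example using the errors from Table 7.1, arranged (by assumption)
# so that feat_err[i][j] holds the error of feature j for class i:
errors = [[5.26, 21.03], [4.25, 3.25], [1.78, 9.36],
          [0.85, 10.05], [3.99, 2.01]]
print(coordinator_vote(errors))  # -> 0.85 (pattern class 4, feature 1)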
Figure 7.3 shows the estimated execution time of the voting function for 10,000 pattern classes as a function of the number of features used, n_{feat}. It is assumed that the computation time of an instruction is 1 μs. Note that the minimum voting function takes only one second to select the lowest error from 100 features on 10,000 pattern classes (trained patterns). This voting process thus scales well, since the recognition procedure is distributed over a group of collaborative DHGN networks.
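The one-second figure follows directly from the instruction count. Below is a minimal sketch of the estimate, assuming (as stated above) one executed instruction per (pattern class, feature) comparison and 1 μs per instruction; the names are illustrative only.

INSTRUCTION_TIME_S = 1e-6  # assumed cost of one instruction: 1 μs

def voting_time(num_classes, num_features):
    # One comparison per (pattern class, feature) pair
    return num_classes * num_features * INSTRUCTION_TIME_S

print(voting_time(10_000, 100))  # -> 1.0 second, as in Figure 7.3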
The DHGN multi-feature recognition scheme allows recognition to be performed in a scalable manner, extending its capability to use multiple pattern features in the recognition procedure. With its distributed architecture, the DHGN enables recognition/classification to be executed in a highly scalable manner while maintaining low computational complexity. However, the proposed pre- and