The chapter concludes with a performance analysis over a selected class of problems. The functional mapping properties of two-dimensional neurons are further extended in this chapter to the three-dimensional neuron. The three-dimensional face recognition problem discussed in the chapter opens a path toward further biometric applications.
Chapter 7 introduces an approach for evaluating, monitoring, and maintaining the stability of adaptive learning machines in prospective applications. The approach allows us to evaluate the supervised and unsupervised learning capabilities of neural units in the complex domain, which form a fundamental class of high-dimensional neurocomputing. The chapter starts with an overview of selected data preprocessing and feature extraction techniques. It presents PCA and ICA algorithms in the real and complex domains for feature extraction, along with an optimal neural recognizer (OCON: One-Class-in-One-Neuron) for statistical analysis of the data. Readers will be most interested in how this topic brings out the computational capability of a single neuron and how the considered techniques can be applied to generate large-scale, realistic simulations. The machine learning applications are analyzed through typical real-life biometric problems. The illustrative examples demonstrate the improvement in learning capability, in terms of speed and performance, with fewer neurons and learning parameters than standard neural networks, thereby reducing the overall burden on the network.