ABSTRACT

This chapter presents a Laguerre-polynomial neuronet model together with its theoretical basis, and proposes a weights-and-structure-determination (WASD) algorithm for determining the neuronet's weights and structure. Numerical studies, including comparisons, substantiate the efficacy and superiority of the Laguerre-polynomial neuronet equipped with the proposed WASD algorithm, and the small testing errors show that the neuronet possesses very good generalization ability. By contrast, the conventional back-propagation algorithm is essentially a gradient-descent optimization strategy, which adjusts the weights so as to bring the input/output behavior of a neuronet into a desired mapping for a given application environment.
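
For reference, the following is a rough, hypothetical sketch of the gradient-descent weight adjustment mentioned in the last sentence; it is not the chapter's WASD algorithm. It trains only the linear output weights of a single-hidden-layer neuronet with Laguerre-polynomial activations, and the basis size, learning rate, and target function are illustrative assumptions.

```python
# Hypothetical sketch (not the chapter's WASD algorithm): gradient-descent
# training of the output weights of a Laguerre-polynomial neuronet.
import numpy as np

def laguerre_basis(x, n_hidden):
    """First n_hidden Laguerre polynomials L_0..L_{n_hidden-1} evaluated at x,
    via the recurrence (k+1)L_{k+1}(x) = (2k+1-x)L_k(x) - k L_{k-1}(x)."""
    L = np.zeros((x.size, n_hidden))
    L[:, 0] = 1.0
    if n_hidden > 1:
        L[:, 1] = 1.0 - x
    for k in range(1, n_hidden - 1):
        L[:, k + 1] = ((2 * k + 1 - x) * L[:, k] - k * L[:, k - 1]) / (k + 1)
    return L

# Illustrative data: approximate f(x) = exp(-x) * sin(x) on [0, 4] (assumed target).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 4.0, 200)
y = np.exp(-x) * np.sin(x)

n_hidden = 8                               # assumed number of hidden neurons
Phi = laguerre_basis(x, n_hidden)          # hidden-layer activations
w = rng.standard_normal(n_hidden) * 0.1    # output weights to be learned
lr = 1e-2                                  # assumed learning rate

# Conventional gradient descent on the squared output error, i.e. the
# weight-adjustment idea underlying back-propagation.
for epoch in range(5000):
    err = Phi @ w - y                      # network output minus target
    grad = Phi.T @ err / x.size            # gradient of 0.5*mean(err^2) w.r.t. w
    w -= lr * grad

print("final training MSE:", np.mean((Phi @ w - y) ** 2))
```

Unlike this iterative scheme, which needs a hand-tuned learning rate and many epochs, the WASD approach described in the chapter determines both the weights and the hidden-layer structure directly.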