Title: Backpropagation using generalized least squares
Authors: Loh, A.P. 
Fong, K.F. 
Issue Date: 1993
Citation: Loh, A.P., Fong, K.F. (1993). Backpropagation using generalized least squares. 1993 IEEE International Conference on Neural Networks: 592-597. ScholarBank@NUS Repository.
Abstract: The backpropagation algorithm is essentially a steepest-descent optimization routine that minimizes a quadratic performance index at each step. In this paper, the backpropagation algorithm is re-cast in the framework of Generalized Least Squares. The main advantage is that this eliminates the need to choose an optimal value for the step size required by the standard backpropagation algorithm. A simulation result on the approximation of a non-linear dynamical system is presented to show its rapid rate of convergence compared to that of the backpropagation algorithm.
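The step-size issue the abstract refers to can be seen already in the simplest quadratic case. The sketch below is an illustration only, not the paper's GLS recursion: for a single linear weight with a quadratic error, steepest descent (the backpropagation-style update) requires a hand-tuned step size eta, whereas a least-squares solve yields the minimizer directly with no step size at all.

```python
# Illustration (not the paper's method): fit y = w*x by minimizing
# E(w) = 0.5 * sum (w*x - y)^2 two ways.
xs = [0.5, 1.0, 1.5, 2.0, 2.5]
w_true = 3.0
ys = [w_true * x for x in xs]

# Backpropagation-style steepest descent: eta must be chosen by hand;
# too large diverges, too small converges slowly.
eta = 0.05
w = 0.0
for _ in range(200):
    grad = sum((w * x - y) * x for x, y in zip(xs, ys))  # dE/dw
    w -= eta * grad

# Least-squares (normal-equation) solution: no step size needed.
w_ls = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

print(round(w, 6), round(w_ls, 6))
```

Both routes recover the true weight here; the point is that the second needs no tuning parameter, which is the advantage the GLS formulation generalizes to the multi-layer, non-quadratic setting.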
Source Title: 1993 IEEE International Conference on Neural Networks
ISBN: 0780312007
Appears in Collections: Staff Publications

