Please use this identifier to cite or link to this item: https://scholarbank.nus.edu.sg/handle/10635/72507
DC Field: Value
dc.title: Backpropagation using generalized least squares
dc.contributor.author: Loh, A.P.
dc.contributor.author: Fong, K.F.
dc.date.accessioned: 2014-06-19T05:08:50Z
dc.date.available: 2014-06-19T05:08:50Z
dc.date.issued: 1993
dc.identifier.citation: Loh, A.P., Fong, K.F. (1993). Backpropagation using generalized least squares. 1993 IEEE International Conference on Neural Networks: 592-597. ScholarBank@NUS Repository.
dc.identifier.isbn: 0780312007
dc.identifier.uri: http://scholarbank.nus.edu.sg/handle/10635/72507
dc.description.abstract: The backpropagation algorithm is essentially a steepest-gradient-descent optimization routine that minimizes a quadratic performance index at each step. In this paper, the backpropagation algorithm is recast in the framework of Generalized Least Squares. The main advantage is that this eliminates the need to choose an optimal value for the step size required by the standard backpropagation algorithm. A simulation of the approximation of a non-linear dynamical system is presented to show the method's rapid convergence compared with the backpropagation algorithm.
dc.source: Scopus
dc.type: Conference Paper
dc.contributor.department: ELECTRICAL ENGINEERING
dc.description.sourcetitle: 1993 IEEE International Conference on Neural Networks
dc.description.page: 592-597
dc.identifier.isiut: NOT_IN_WOS
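The abstract's key point — that a least-squares formulation removes the hand-tuned step size of plain gradient descent — can be illustrated with a minimal sketch. This is not the paper's algorithm: it uses a toy single-weight linear model, where the least-squares step solves the quadratic problem in closed form, while standard gradient descent needs a learning rate `eta` chosen by the user. All names, the data, and the learning rate are illustrative assumptions.

```python
# Hedged toy sketch (not the paper's GLS backpropagation): contrast a
# fixed-step gradient-descent update with a least-squares update on a
# single linear "neuron" y = w * x.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=50)
w_true = 3.0
y = w_true * x  # noiseless targets for the toy example

def backprop_step(w, eta=0.01):
    """One steepest-descent step on E(w) = sum((y - w*x)^2).

    eta must be hand-tuned: too large diverges, too small crawls.
    """
    grad = -2.0 * np.sum((y - w * x) * x)
    return w - eta * grad

def least_squares_step():
    """Solve the quadratic problem exactly: no step size needed.

    For this linear model the normal equation gives w in one step,
    which is the flavor of advantage the GLS recasting aims for.
    """
    return np.sum(x * y) / np.sum(x * x)

w_bp = 0.0
for _ in range(100):          # many fixed-step iterations...
    w_bp = backprop_step(w_bp)

w_ls = least_squares_step()   # ...versus a single closed-form solve
```

Here both estimates recover `w_true`, but the least-squares update reaches it in one solve with no `eta`, which mirrors the convergence advantage the abstract claims for the GLS formulation.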
Appears in Collections: Staff Publications

Files in This Item:
There are no files associated with this item.



Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.