Title: Towards minimal network architectures with evolutionary growth perceptrons
Authors: Romaniuk, Steve G.
Issue Date: 1993
Citation: Romaniuk, Steve G. (1993). Towards minimal network architectures with evolutionary growth perceptrons. Proceedings of the International Joint Conference on Neural Networks 1: 717-720. ScholarBank@NUS Repository.
Abstract: The purpose of this paper is twofold. First, it shows how the perceptron learning rule can be reintroduced as a local learning technique within the general framework of automatic network construction. Second, it points out how choosing the right training set during network construction can have profound effects on the quality of the created networks, in terms of the number of hidden units and connections. The main vehicle for accomplishing this is the use of simple evolutionary processes for automatically determining the correct size of training sets and finding the right examples to train on during the various stages of network construction.
Source Title: Proceedings of the International Joint Conference on Neural Networks
URI: http://scholarbank.nus.edu.sg/handle/10635/134160
ISBN: 0780314212
Appears in Collections: Staff Publications
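The abstract combines two standard ingredients: the classic perceptron learning rule as a local trainer, and an evolutionary search over which training examples to present. The paper itself is not available here, so the following is only a minimal generic sketch of that combination, not the author's algorithm: bit-mask "chromosomes" select a subset of examples, a perceptron is trained on each subset, and fitness rewards subsets that still classify the full set well while using fewer examples. All function names, parameters, and the toy OR task are illustrative assumptions.

```python
import random

def perceptron_train(examples, epochs=100):
    """Classic perceptron learning rule for a single threshold unit.
    Returns learned weights and bias. (Generic textbook rule, not the
    paper's specific construction procedure.)"""
    n = len(examples[0][0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for x, target in examples:
            out = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = target - out
            if err != 0:
                errors += 1
                for i in range(n):
                    w[i] += err * x[i]  # w <- w + (t - y) * x
                b += err
        if errors == 0:  # converged on this (sub)set
            break
    return w, b

def accuracy(w, b, examples):
    """Fraction of examples the unit classifies correctly."""
    return sum(
        1 for x, t in examples
        if (1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0) == t
    ) / len(examples)

def evolve_training_subset(full_set, pop_size=10, generations=20, seed=0):
    """Evolve bit masks that choose training examples. Fitness: accuracy
    on the FULL set of a perceptron trained only on the chosen subset,
    minus a small penalty per chosen example (pressure toward smaller
    training sets). Hypothetical illustration of the abstract's idea."""
    rng = random.Random(seed)
    m = len(full_set)

    def fitness(mask):
        subset = [ex for bit, ex in zip(mask, full_set) if bit]
        if not subset:
            return 0.0
        w, b = perceptron_train(subset)
        return accuracy(w, b, full_set) - 0.01 * sum(mask)

    pop = [[rng.randint(0, 1) for _ in range(m)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]        # elitist selection
        children = []
        for _ in range(pop_size - len(survivors)):
            pa, pb = rng.sample(survivors, 2)
            cut = rng.randrange(1, m)
            child = pa[:cut] + pb[cut:]         # one-point crossover
            child[rng.randrange(m)] ^= 1        # point mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

# Toy linearly separable task: logical OR.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
mask = evolve_training_subset(data)
chosen = [ex for bit, ex in zip(mask, data) if bit]
w, b = perceptron_train(chosen)
full_acc = accuracy(w, b, data)
```

On this toy task the evolved subset typically needs fewer than all four examples to reach full accuracy, which mirrors the abstract's point that the right (and smaller) training set can suffice during construction; the subset-size penalty of 0.01 per example is an arbitrary choice.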