Please use this identifier to cite or link to this item: https://scholarbank.nus.edu.sg/handle/10635/134160
dc.title: Towards minimal network architectures with evolutionary growth perceptrons
dc.contributor.author: Romaniuk, Steve G.
dc.date.accessioned: 2016-12-20T08:44:09Z
dc.date.available: 2016-12-20T08:44:09Z
dc.date.issued: 1993
dc.identifier.citation: Romaniuk, Steve G. (1993). Towards minimal network architectures with evolutionary growth perceptrons. Proceedings of the International Joint Conference on Neural Networks 1 : 717-720. ScholarBank@NUS Repository.
dc.identifier.isbn: 0780314212
dc.identifier.uri: http://scholarbank.nus.edu.sg/handle/10635/134160
dc.description.abstract: The purpose of this paper is twofold: First, it will show how the perceptron learning rule can be re-introduced as a local learning technique within the general framework of automatic network construction. Second, it will point out how choosing the right training set during network construction can have profound effects on the quality of the created networks, in terms of the number of hidden units and connections. The main vehicle for accomplishing this feat is the use of simple evolutionary processes for automatically determining the correct size of training sets and finding the right examples to train on during the various stages of network construction.
dc.type: Conference Paper
dc.contributor.department: INFORMATION SYSTEMS & COMPUTER SCIENCE
dc.description.sourcetitle: Proceedings of the International Joint Conference on Neural Networks
dc.description.volume: 1
dc.description.page: 717-720
dc.description.coden: 85OFA
dc.identifier.isiut: NOT_IN_WOS
Appears in Collections: Staff Publications

Files in This Item:
There are no files associated with this item.
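Because no full text is deposited with this record, the following is only a rough sketch of the two ingredients named in the abstract: a perceptron learning rule used as a local learner for each newly added hidden unit, and a small genetic algorithm that selects which (and therefore how many) training examples that unit is trained on. The tower-style wiring, the fitness function, and all names and hyperparameters below are assumptions for illustration, not the construction actually described by Romaniuk (1993).

```python
# Hedged sketch only: the paper's exact algorithm is not available here.
# It combines (a) the perceptron rule as a local learner for each new unit
# and (b) a toy genetic algorithm that evolves the training subset.
import numpy as np

rng = np.random.default_rng(0)


def perceptron_train(X, y, epochs=50):
    """Local perceptron learning rule on a (sub)set of examples.

    X: (n, d) inputs, y: (n,) targets in {0, 1}.  Returns a weight vector
    (bias included) for a single threshold unit."""
    Xb = np.hstack([X, np.ones((len(X), 1))])        # append bias input
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        for xi, ti in zip(Xb, y):
            out = float(xi @ w > 0)
            w += 0.1 * (ti - out) * xi               # classic perceptron update
    return w


def unit_output(X, w):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return (Xb @ w > 0).astype(float)


def evolve_subset(X, y, pop=20, gens=30):
    """Toy GA over bit-masks choosing which examples a new unit trains on.

    Fitness rewards accuracy of the trained unit on *all* examples and
    lightly penalises large subsets, so the search also settles the
    training-set size (an assumed stand-in for the paper's criterion)."""
    n = len(y)
    masks = rng.integers(0, 2, size=(pop, n)).astype(bool)

    def fitness(mask):
        if mask.sum() == 0:
            return -1.0
        w = perceptron_train(X[mask], y[mask])
        acc = np.mean(unit_output(X, w) == y)
        return acc - 0.01 * mask.sum() / n           # small size penalty

    for _ in range(gens):
        scores = np.array([fitness(m) for m in masks])
        order = np.argsort(scores)[::-1]
        parents = masks[order[: pop // 2]]           # truncation selection
        children = parents.copy()
        flip = rng.random(children.shape) < 0.05     # bit-flip mutation
        children ^= flip
        masks = np.vstack([parents, children])
    return max(masks, key=fitness)


def grow_network(X, y, max_units=5):
    """Grow units one at a time (tower-style: each new unit also sees the
    previous unit's output) until the training set is learned."""
    inputs, weights = X.copy(), []
    out = np.zeros(len(y))
    for _ in range(max_units):
        mask = evolve_subset(inputs, y)
        w = perceptron_train(inputs[mask], y[mask])
        weights.append(w)
        out = unit_output(inputs, w)
        if np.all(out == y):                         # training set learned
            break
        inputs = np.hstack([X, out[:, None]])        # feed last output forward
    return weights, out


if __name__ == "__main__":
    # XOR is not linearly separable, so more than one unit is usually needed.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([0, 1, 1, 0], dtype=float)
    weights, out = grow_network(X, y)
    print(f"units grown: {len(weights)}, "
          f"training accuracy: {np.mean(out == y):.2f}")
```

Running the script grows units on XOR and prints the resulting training accuracy; the subset-size penalty in the fitness function is one plausible way to let the evolutionary search determine both the size and the content of the training set, as the abstract suggests.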
