Please use this identifier to cite or link to this item: https://scholarbank.nus.edu.sg/handle/10635/99162
DC Field: Value
dc.title: A penalty-function approach for pruning feedforward neural networks
dc.contributor.author: Setiono, R.
dc.date.accessioned: 2014-10-27T06:01:17Z
dc.date.available: 2014-10-27T06:01:17Z
dc.date.issued: 1997-01-01
dc.identifier.citation: Setiono, R. (1997-01-01). A penalty-function approach for pruning feedforward neural networks. Neural Computation 9 (1): 185-204. ScholarBank@NUS Repository.
dc.identifier.issn: 08997667
dc.identifier.uri: http://scholarbank.nus.edu.sg/handle/10635/99162
dc.description.abstract: This article proposes the use of a penalty function for pruning feedforward neural networks by weight elimination. The proposed penalty function consists of two terms: the first discourages the use of unnecessary connections, and the second prevents the weights of the connections from taking excessively large values. Simple criteria for eliminating weights from the network are also given. The effectiveness of this penalty function is tested on three well-known problems: the contiguity problem, the parity problems, and the monks problems. The pruned networks obtained for many of these problems have fewer connections than previously reported in the literature.
dc.source: Scopus
dc.type: Article
dc.contributor.department: INFORMATION SYSTEMS & COMPUTER SCIENCE
dc.description.sourcetitle: Neural Computation
dc.description.volume: 9
dc.description.issue: 1
dc.description.page: 185-204
dc.identifier.isiut: NOT_IN_WOS
Appears in Collections: Staff Publications
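
The abstract above describes a penalty made of two terms: one that discourages unnecessary connections and one that keeps weight magnitudes bounded. The sketch below is a minimal illustration of a penalty of that flavor, not the article's exact formulation; the saturating quadratic-over-quadratic term, the threshold-based pruning rule, and the coefficients eps1, eps2, beta are assumptions made for illustration.

```python
# Illustrative sketch (assumed form, not the article's exact penalty):
#   term 1: beta*w^2 / (1 + beta*w^2) -> saturates for large |w|, so it mainly
#           pushes small, unnecessary weights toward zero
#   term 2: w^2                       -> weight-decay term that keeps weights
#           from taking excessively large values
# eps1, eps2, beta are hypothetical hyperparameters chosen here for illustration.
import numpy as np

def penalty(W, eps1=1e-1, eps2=1e-4, beta=10.0):
    """Penalty value to be added to the network's error function."""
    sq = W ** 2
    term1 = np.sum(beta * sq / (1.0 + beta * sq))  # connection-usage term
    term2 = np.sum(sq)                             # magnitude term
    return eps1 * term1 + eps2 * term2

def penalty_grad(W, eps1=1e-1, eps2=1e-4, beta=10.0):
    """Gradient of the penalty w.r.t. W, added to the error gradient during training."""
    sq = W ** 2
    d_term1 = 2.0 * beta * W / (1.0 + beta * sq) ** 2
    d_term2 = 2.0 * W
    return eps1 * d_term1 + eps2 * d_term2

# A simple (assumed) pruning criterion: after training with the penalty,
# drop connections whose weights remain below a small threshold.
def prune_mask(W, threshold=0.1):
    return np.abs(W) >= threshold  # True = keep the connection
```

In use, the penalty and its gradient are simply added to the training loss and its gradient; after convergence, connections rejected by the pruning mask are removed and the network is retrained or fine-tuned.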


