Title: Effective neural network pruning using cross-validation
Authors: Huynh, T.Q.; Setiono, R.
Issue Date: 2005
Citation: Huynh, T.Q., Setiono, R. (2005). Effective neural network pruning using cross-validation. Proceedings of the International Joint Conference on Neural Networks 2: 972-977. ScholarBank@NUS Repository.
Abstract: This paper addresses the problem of finding neural networks with optimal topology such that their generalization capability is maximized. Our approach is to combine the use of a penalty function during network training and a subset of the training samples for cross-validation. The penalty is added to the error function so that the weights of network connections that are not useful have small magnitude. Such network connections can be pruned if the resulting accuracy of the network does not change beyond a preset level. Training samples in the cross-validation set are used to indicate when network pruning is terminated. Our results on 32 publicly available data sets show that the proposed method outperforms existing neural network and decision tree methods for classification. © 2005 IEEE.
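The abstract outlines a two-part procedure: train with a magnitude penalty so that unneeded weights shrink, then iteratively prune small-magnitude connections, stopping when accuracy on a held-out cross-validation subset degrades beyond a preset tolerance. A minimal sketch of that pruning loop, using a toy linear classifier and synthetic data in place of the paper's networks and benchmark sets (all names, data, and hyperparameters here are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data (hypothetical stand-in for the paper's 32 benchmark sets).
X = np.vstack([rng.normal(-1, 1, (50, 4)), rng.normal(1, 1, (50, 4))])
y = np.array([0] * 50 + [1] * 50)

# Reserve a subset of the training samples as the cross-validation set.
idx = rng.permutation(100)
tr, cv = idx[:70], idx[70:]

def train(X, y, mask, penalty=1e-2, lr=0.1, steps=500):
    """Logistic model trained with a weight-magnitude penalty.

    The penalty term in the gradient drives weights on unhelpful
    connections toward small magnitude; masked (pruned) weights stay zero.
    """
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-X @ w))
        grad = X.T @ (p - y) / len(y) + penalty * w
        w -= lr * grad
        w *= mask  # hold pruned connections at zero
    return w

def accuracy(w, X, y):
    return np.mean((X @ w > 0) == y)

mask = np.ones(X.shape[1])
w = train(X[tr], y[tr], mask)
base_cv = accuracy(w, X[cv], y[cv])
eps = 0.02  # preset tolerance on the accuracy change

# Prune the smallest-magnitude surviving weight, retrain, and keep the cut
# only while cross-validation accuracy stays within eps of the baseline.
while mask.sum() > 1:
    alive = np.flatnonzero(mask)
    cand = alive[np.argmin(np.abs(w[alive]))]
    trial = mask.copy()
    trial[cand] = 0
    w_trial = train(X[tr], y[tr], trial)
    if accuracy(w_trial, X[cv], y[cv]) >= base_cv - eps:
        mask, w = trial, w_trial
    else:
        break  # cross-validation accuracy dropped too far: stop pruning
```

The cross-validation set plays the role described in the abstract: it is never used to fit the weights, only to decide when pruning must terminate.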
Source Title: Proceedings of the International Joint Conference on Neural Networks
ISBN: 0780390482
DOI: 10.1109/IJCNN.2005.1555984
Appears in Collections:Staff Publications

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.