Title: Effective neural network pruning using cross-validation
Citation: Huynh, T.Q., Setiono, R. (2005). Effective neural network pruning using cross-validation. Proceedings of the International Joint Conference on Neural Networks, 2: 972-977. ScholarBank@NUS Repository. https://doi.org/10.1109/IJCNN.2005.1555984
Abstract: This paper addresses the problem of finding neural networks with optimal topology such that their generalization capability is maximized. Our approach is to combine the use of a penalty function during network training and a subset of the training samples for cross-validation. The penalty is added to the error function so that the weights of network connections that are not useful have small magnitude. Such network connections can be pruned if the resulting accuracy of the network does not change beyond a preset level. Training samples in the cross-validation set are used to indicate when network pruning is terminated. Our results on 32 publicly available data sets show that the proposed method outperforms existing neural network and decision tree methods for classification. © 2005 IEEE.
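The abstract outlines a two-part procedure: train with a penalty term so that unhelpful connection weights shrink, then prune small-magnitude connections while a held-out cross-validation set decides when pruning must stop. The sketch below illustrates that general idea on a toy single-layer model with a quadratic (weight-decay) penalty and an accuracy tolerance; the data, network, penalty form, and thresholds are illustrative assumptions, not the authors' actual algorithm or experimental setup.

```python
# Illustrative sketch only: penalty-regularized training followed by
# magnitude-based pruning with a validation-accuracy stopping rule.
import numpy as np

rng = np.random.default_rng(0)

# Toy linearly separable data: the label is the sign of the first feature.
X = rng.normal(size=(200, 5))
y = np.where(X[:, 0] > 0, 1.0, -1.0)
X_train, y_train = X[:150], y[:150]
X_val, y_val = X[150:], y[150:]

def accuracy(w, X, y):
    pred = np.where(X @ w > 0, 1.0, -1.0)
    return float((pred == y).mean())

# Train a single-layer "network" by gradient descent on squared error
# plus a quadratic penalty that drives unhelpful weights toward zero.
w = rng.normal(scale=0.1, size=5)
lam = 0.01  # penalty coefficient (assumed value)
for _ in range(500):
    grad = X_train.T @ (X_train @ w - y_train) / len(y_train) + lam * w
    w -= 0.1 * grad

# Prune: repeatedly zero the smallest-magnitude remaining weight, keeping
# the change only while validation accuracy stays within a preset
# tolerance of the unpruned network's accuracy.
tol = 0.02  # "preset level" of acceptable accuracy change (assumed)
best_acc = accuracy(w, X_val, y_val)
while np.count_nonzero(w) > 1:
    idx = np.argmin(np.where(w != 0, np.abs(w), np.inf))
    saved = w[idx]
    w[idx] = 0.0
    if accuracy(w, X_val, y_val) < best_acc - tol:
        w[idx] = saved  # undo: this prune hurt validation accuracy
        break

print("nonzero weights after pruning:", np.count_nonzero(w))
```

Undoing the last prune once validation accuracy degrades past the tolerance is what makes the cross-validation subset, rather than training error, terminate the pruning, which is the role the abstract assigns to it.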
Source Title: Proceedings of the International Joint Conference on Neural Networks
Appears in Collections: Staff Publications
Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.