Title: Generating rules from trained network using fast pruning
Authors: Setiono, Rudy; Leow, Wee Kheng
Citation: Setiono, Rudy; Leow, Wee Kheng (1999). Generating rules from trained network using fast pruning. Proceedings of the International Joint Conference on Neural Networks 6: 4095-4098. ScholarBank@NUS Repository.
Abstract: Before symbolic rules are extracted from a trained neural network, the network is usually pruned so as to obtain more concise rules. Typical pruning algorithms require retraining the network, which incurs additional cost. This paper presents FERNN, a fast method for extracting rules from trained neural networks without network retraining. Given a fully connected trained feedforward network, FERNN first identifies the relevant hidden units by computing their information gains. Next, it identifies relevant connections from the input units to the relevant hidden units by checking the magnitudes of their weights. Finally, FERNN generates rules based on the relevant hidden units and weights. Our experimental results show that the size and accuracy of the tree generated are comparable to those extracted by another method which prunes and retrains the network.
Source Title: Proceedings of the International Joint Conference on Neural Networks
Appears in Collections: Staff Publications
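The three-step procedure summarized in the abstract (rank hidden units by information gain, then keep only large-magnitude input weights) can be sketched roughly as follows. This is an illustrative reconstruction, not the paper's implementation: the function names, the binarization threshold on hidden activations, and the `gain_tol`/`weight_tol` cutoffs are all assumptions; the actual FERNN relevance criteria are defined in the paper.

```python
import numpy as np

def information_gain(activations, labels, threshold=0.5):
    """Information gain from splitting the samples on one hidden unit's
    binarized activation (assumed split at `threshold`)."""
    def entropy(y):
        _, counts = np.unique(y, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    mask = activations >= threshold
    n = len(labels)
    gain = entropy(labels)
    for part in (labels[mask], labels[~mask]):
        if len(part):
            gain -= len(part) / n * entropy(part)
    return gain

def fernn_prune(hidden_acts, labels, W_in, gain_tol=0.01, weight_tol=0.1):
    """Sketch of FERNN's two pruning passes (no retraining):
    1. keep hidden units whose information gain exceeds gain_tol;
    2. for each kept unit, keep the input connections whose weight
       magnitude exceeds weight_tol."""
    gains = np.array([information_gain(hidden_acts[:, j], labels)
                      for j in range(hidden_acts.shape[1])])
    relevant_units = np.where(gains > gain_tol)[0]
    relevant_inputs = {j: np.where(np.abs(W_in[:, j]) > weight_tol)[0]
                       for j in relevant_units}
    return relevant_units, relevant_inputs
```

Rule generation (the paper's final step) would then enumerate conditions only over the surviving units and weights, which is what makes the extracted rules concise.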