Title: FERNN: an algorithm for fast extraction of rules from neural networks
Authors: Setiono, R.; Leow, W.K.
Issue Date: 2000
Citation: Setiono, R., Leow, W.K. (2000). FERNN: an algorithm for fast extraction of rules from neural networks. Applied Intelligence 12 (1-2): 15-25. ScholarBank@NUS Repository.
Abstract: Before symbolic rules are extracted from a trained neural network, the network is usually pruned so as to obtain more concise rules. Typical pruning algorithms require retraining the network, which incurs additional cost. This paper presents FERNN, a fast method for extracting rules from trained neural networks without network retraining. Given a fully connected trained feedforward network with a single hidden layer, FERNN first identifies the relevant hidden units by computing their information gains. For each relevant hidden unit, its activation values are divided into two subintervals such that the information gain is maximized. FERNN finds the set of relevant network connections from the input units to this hidden unit by checking the magnitudes of their weights. The connections with large weights are identified as relevant. Finally, FERNN generates rules that distinguish the two subintervals of the hidden activation values in terms of the network inputs. Experimental results show that the size and the predictive accuracy of the tree generated are comparable to those extracted by another method which prunes and retrains the network.
Source Title: Applied Intelligence
URI: http://scholarbank.nus.edu.sg/handle/10635/42905
ISSN: 0924-669X
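The abstract's split step — dividing a hidden unit's activation values into two subintervals so that information gain is maximized — works like the threshold search in decision-tree induction. A minimal sketch follows; the function names and the midpoint candidate-threshold convention are illustrative assumptions, not details taken from the paper:

```python
import math

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def best_activation_split(activations, labels):
    """Find the threshold on one hidden unit's activation values that
    maximizes information gain over the training labels.
    Candidate thresholds are midpoints between consecutive distinct
    sorted activation values (an assumed convention)."""
    pairs = sorted(zip(activations, labels))
    n = len(pairs)
    base = entropy(labels)
    best_gain, best_t = 0.0, None
    for i in range(1, n):
        if pairs[i - 1][0] == pairs[i][0]:
            continue  # no boundary between equal activation values
        t = (pairs[i - 1][0] + pairs[i][0]) / 2.0
        left = [y for _, y in pairs[:i]]
        right = [y for _, y in pairs[i:]]
        gain = (base
                - (len(left) / n) * entropy(left)
                - (len(right) / n) * entropy(right))
        if gain > best_gain:
            best_gain, best_t = gain, t
    return best_t, best_gain
```

A hidden unit whose best split yields low gain contributes little to the classification and can be treated as irrelevant, which is how the abstract's relevance test avoids prune-and-retrain cycles.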
Appears in Collections: Staff Publications