Please use this identifier to cite or link to this item: https://doi.org/10.1109/72.839020
Title: Extracting M-of-N rules from trained neural networks
Authors: Setiono, R. 
Issue Date: 2000
Source: Setiono, R. (2000). Extracting M-of-N rules from trained neural networks. IEEE Transactions on Neural Networks 11 (2) : 512-519. ScholarBank@NUS Repository. https://doi.org/10.1109/72.839020
Abstract: An effective algorithm for extracting M-of-N rules from trained feedforward neural networks is proposed. Two components of the algorithm distinguish our method from previously proposed algorithms that extract symbolic rules from neural networks. First, we train a network where each input of the data can take only one of two possible values, -1 or 1. Second, we apply the hyperbolic tangent function to each connection from the input layer to the hidden layer of the network. By applying this squashing function, the activation values at the hidden units are effectively computed as the hyperbolic tangent (or the sigmoid) of the weighted inputs, where the weights have magnitudes equal to one. By restricting the inputs and the weights to the binary values -1 and 1, the extraction of M-of-N rules from the networks becomes trivial. We demonstrate the effectiveness of the proposed algorithm on several widely tested datasets. For datasets consisting of thousands of patterns with many attributes, the rules extracted by the algorithm are surprisingly simple and accurate.
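The abstract's key observation can be illustrated with a short sketch: once a hidden unit's incoming weights and inputs are all in {-1, +1}, the unit's pre-activation is w·x = 2a - N, where a is the number of inputs agreeing in sign with their weights, so the unit fires exactly when at least M of N sign-agreement conditions hold. The function below is an illustrative reconstruction under that assumption, not the paper's exact procedure; the function name and the bias-to-threshold derivation are mine.

```python
import numpy as np

def extract_m_of_n(weights, bias):
    """Derive the M-of-N rule computed by one hidden unit whose inputs
    and incoming weights are all +/-1 (illustrative sketch only).

    With a agreements out of n inputs, w.x = 2a - n, so
    tanh(w.x + bias) > 0  <=>  a > (n - bias) / 2.
    """
    n = len(weights)
    # Smallest integer a satisfying a > (n - bias)/2:
    m = int(np.floor((n - bias) / 2)) + 1
    # Each condition asserts that an input matches its weight's sign.
    conditions = [f"x{i} == {int(w)}" for i, w in enumerate(weights)]
    return m, conditions

# Example: 3 binary inputs, zero bias -> a 2-of-3 rule.
m, conds = extract_m_of_n(np.array([1, -1, 1]), bias=0.0)
# The unit is active iff at least m of the conditions in conds hold.
```

For the example above, w·x = 2a - 3, which is positive only when a ≥ 2, matching the derived threshold; this is why the paper can call the extraction step "trivial" once weights are squashed to ±1.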
Source Title: IEEE Transactions on Neural Networks
URI: http://scholarbank.nus.edu.sg/handle/10635/42374
ISSN: 1045-9227
DOI: 10.1109/72.839020
Appears in Collections:Staff Publications

Files in This Item:
There are no files associated with this item.

SCOPUS™ Citations: 75 (checked on Dec 13, 2017)
Web of Science™ Citations: 52 (checked on Dec 13, 2017)
Page view(s): 598 (checked on Dec 9, 2017)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.