Please use this identifier to cite or link to this item: https://doi.org/10.1016/j.neucom.2007.10.012
Title: Interference-less neural network training
Authors: Hua Ang, J.
Guan, S.-U.
Tan, K.C. 
Mamun, A.A. 
Keywords: Attribute interference
Classification
Input space partitioning
Neural networks
Issue Date: Oct-2008
Citation: Hua Ang, J., Guan, S.-U., Tan, K.C., Mamun, A.A. (2008-10). Interference-less neural network training. Neurocomputing 71 (16-18) : 3509-3524. ScholarBank@NUS Repository. https://doi.org/10.1016/j.neucom.2007.10.012
Abstract: The lack of input-space segregation in conventional neural network (NN) training often causes interference within the network. The interference-less neural network training (ILNNT) method employed in this paper reduces interference among input attributes by identifying attributes that interfere with one another and separating them, while grouping together attributes that are mutually beneficial. Separated attributes in different batches do not share the same hidden neurons, whereas attributes within a batch are connected to the same hidden neurons. ILNNT is applied to widely used benchmark binary and multi-class classification problems, and experimental results from K-fold cross-validation show that the attributes of the datasets used exhibit varying degrees of interference and that NNs with reduced interference achieve high classification accuracy. © 2007 Elsevier B.V. All rights reserved.
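
A minimal sketch of the partitioned-input architecture the abstract describes, not the authors' code: each batch of attributes feeds its own block of hidden neurons, so attributes in different batches never share hidden neurons, while all hidden blocks feed a common output layer. The class name PartitionedNN, the example two-batch grouping, and all hyperparameters below are illustrative assumptions.

    # Sketch only: forward pass of a network with per-batch hidden blocks.
    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class PartitionedNN:
        """Feed-forward net whose hidden layer is split into per-batch blocks.

        `partition` is a list of attribute-index lists, e.g. [[0, 2], [1, 3]];
        hidden neurons in one block see only the attributes of their batch.
        """
        def __init__(self, partition, hidden_per_batch, n_outputs):
            self.partition = partition
            # one input-to-hidden weight matrix per batch
            self.W_in = [rng.normal(0, 0.5, (len(b), hidden_per_batch)) for b in partition]
            self.b_in = [np.zeros(hidden_per_batch) for _ in partition]
            total_hidden = hidden_per_batch * len(partition)
            self.W_out = rng.normal(0, 0.5, (total_hidden, n_outputs))
            self.b_out = np.zeros(n_outputs)

        def forward(self, X):
            # each hidden block receives only the attributes of its own batch
            blocks = [sigmoid(X[:, b] @ W + c)
                      for b, W, c in zip(self.partition, self.W_in, self.b_in)]
            H = np.concatenate(blocks, axis=1)  # block-wise hidden activations
            return sigmoid(H @ self.W_out + self.b_out)

    # toy usage: 4 attributes split into two batches (assumed, not a published grouping)
    X = rng.random((8, 4))
    net = PartitionedNN(partition=[[0, 2], [1, 3]], hidden_per_batch=3, n_outputs=1)
    print(net.forward(X).shape)  # (8, 1)

How the attributes are assigned to batches (i.e., which attributes interfere) is determined by the ILNNT procedure in the article; the fixed partition above only stands in for that result.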
Source Title: Neurocomputing
URI: http://scholarbank.nus.edu.sg/handle/10635/70648
ISSN: 0925-2312
DOI: 10.1016/j.neucom.2007.10.012
Appears in Collections: Staff Publications

Files in This Item:
There are no files associated with this item.


