Title: Interference-less neural network training
Authors: Hua Ang, J.; Guan, S.-U.; Tan, K.C.; Mamun, A.A.
Keywords: Input space partitioning
Citation: Hua Ang, J., Guan, S.-U., Tan, K.C., Mamun, A.A. (2008-10). Interference-less neural network training. Neurocomputing 71 (16-18): 3509-3524. ScholarBank@NUS Repository. https://doi.org/10.1016/j.neucom.2007.10.012
Abstract: The lack of input-space segregation in conventional neural network (NN) training often causes interference within the network. The interference-less neural network training (ILNNT) method employed in this paper reduces interference among input attributes by identifying attributes that interfere with one another and separating them, while grouping together attributes that are mutually beneficial. Attributes separated into different batches do not share hidden neurons, whereas attributes within the same batch are connected to the same hidden neurons. ILNNT is applied to widely used benchmark binary and multi-class classification problems, and experimental results from K-fold cross-validation show that the attributes of these datasets exhibit varying degrees of interference and that NNs with reduced interference achieve high classification accuracy. © 2007 Elsevier B.V. All rights reserved.
Appears in Collections: Staff Publications
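The connectivity scheme the abstract describes (batches of mutually beneficial attributes wired to private sets of hidden neurons, with no hidden neurons shared across batches) can be sketched with a block-structured connectivity mask on the input-to-hidden weights. This is a minimal illustration under assumed batch assignments, not the paper's actual interference-identification procedure:

```python
import numpy as np

rng = np.random.default_rng(0)
n_inputs, n_hidden, n_outputs = 6, 9, 2

# Hypothetical partition: attributes 0-2 form one batch, 3-5 another.
# Each batch gets its own disjoint subset of hidden neurons.
batches = [([0, 1, 2], [0, 1, 2, 3]),      # (input indices, hidden indices)
           ([3, 4, 5], [4, 5, 6, 7, 8])]

# Build a 0/1 mask so a batch's inputs connect only to its own hidden units.
mask = np.zeros((n_inputs, n_hidden))
for in_idx, hid_idx in batches:
    mask[np.ix_(in_idx, hid_idx)] = 1.0

W1 = rng.standard_normal((n_inputs, n_hidden)) * mask  # cross-batch weights zeroed
W2 = rng.standard_normal((n_hidden, n_outputs))        # hidden-to-output, fully connected

def forward(x):
    # Applying the mask on every pass keeps batches disconnected even if
    # W1 were updated by gradient steps during training.
    h = np.tanh(x @ (W1 * mask))
    return h @ W2

y = forward(rng.standard_normal(n_inputs))
print(y.shape)  # (2,)
```

During training, the same mask would be applied to the weight gradients, so an update driven by one attribute batch can never alter the hidden neurons serving another batch, which is the sense in which interference between separated attributes is removed.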
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.