Title: Incremental training based on input space partitioning and ordered attribute presentation with backward elimination
Authors: Guan, S.-U.; Ang, J.H.
Keywords: Data presentation order; Input space partitioning
Issue Date: 2005
Citation: Guan, S.-U., Ang, J.H. (2005). Incremental training based on input space partitioning and ordered attribute presentation with backward elimination. Journal of Intelligent Systems 14 (4): 321-351. ScholarBank@NUS Repository.
Abstract: A neural network training method, ID-BT (Incremental Discriminatory Batch Training), is presented in this paper. The method separates the input space into two batches, significant and insignificant attributes, and, before introducing them into the network, orders the attributes within each batch according to their individual discrimination ability. By backward-eliminating insignificant attributes that prove futile, the generalization accuracy of network training is increased. Incremental Discriminatory Batch and Individual Training (ID-BIT), which further improves on ID-BT, introduces significant attributes individually and insignificant attributes as a batch. The architecture used for both methods employs several incremental learning algorithms. We tested our algorithms extensively on several widely used benchmark problems from PROBEN1. The simulation results show that these two methods outperform both incremental training with an increasing input dimension and conventional batch training, where no partitioning of the neural network input space occurs, achieving better network performance in terms of generalization accuracy.
Source Title: Journal of Intelligent Systems
URI: http://scholarbank.nus.edu.sg/handle/10635/56317
ISSN: 0334-1860
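The core idea described in the abstract, scoring each attribute by its individual discrimination ability, partitioning the input space into significant and insignificant batches, and backward-eliminating insignificant attributes that do not help generalization, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: a nearest-class-mean classifier stands in for the neural network, and the toy data, `sig_threshold` parameter, and helper names are assumptions made for the example.

```python
# Illustrative sketch of the ID-BT idea (not the paper's code): rank attributes
# by individual discrimination ability, split them into significant and
# insignificant batches, then backward-eliminate insignificant attributes that
# do not improve validation accuracy. The classifier, data, and threshold are
# all stand-ins chosen for a self-contained demo.

def accuracy(train, val, attrs):
    """Validation accuracy of a nearest-class-mean classifier using `attrs`."""
    if not attrs:
        return 0.0
    means = {}
    for label in sorted({y for _, y in train}):
        rows = [x for x, y in train if y == label]
        means[label] = [sum(r[a] for r in rows) / len(rows) for a in attrs]
    correct = 0
    for x, y in val:
        pred = min(means, key=lambda lb: sum(
            (x[a] - m) ** 2 for a, m in zip(attrs, means[lb])))
        correct += pred == y
    return correct / len(val)

def id_bt(train, val, n_attrs, sig_threshold=0.75):
    # 1. Score each attribute on its own (its "discrimination ability").
    scores = sorted(((accuracy(train, val, [a]), a) for a in range(n_attrs)),
                    reverse=True)                      # best attribute first
    significant = [a for s, a in scores if s >= sig_threshold]
    insignificant = [a for s, a in scores if s < sig_threshold]
    # 2. Train on the ordered significant batch, then add the insignificant batch.
    base = accuracy(train, val, significant)
    candidate = significant + insignificant
    # 3. Backward elimination: drop insignificant attributes whose removal
    #    does not hurt (i.e. keeps or improves) generalization accuracy.
    for a in insignificant:
        without = [c for c in candidate if c != a]
        if accuracy(train, val, without) >= accuracy(train, val, candidate):
            candidate = without
    return candidate, max(base, accuracy(train, val, candidate))

# Toy data: attribute 0 separates the classes; attribute 1 is pure noise.
train = [([0.0, 0.9], 0), ([0.1, 0.1], 0), ([1.0, 0.8], 1), ([0.9, 0.2], 1)]
val = [([0.05, 0.5], 0), ([0.95, 0.5], 1)]
attrs, acc = id_bt(train, val, n_attrs=2)
```

On this toy data the noisy attribute 1 scores below the threshold, lands in the insignificant batch, and is backward-eliminated, leaving only the discriminative attribute 0 selected.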
Appears in Collections: Staff Publications
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.