Please use this identifier to cite or link to this item:
https://doi.org/10.1016/j.neucom.2005.02.002
Title: Output partitioning of neural networks
Authors: Guan, S.-U.; Yinan, Q.; Tan, S.K.; Li, S.
Keywords: Constructive learning algorithm; Neural networks; Output partitioning
Issue Date: Oct-2005
Citation: Guan, S.-U., Yinan, Q., Tan, S.K., Li, S. (2005-10). Output partitioning of neural networks. Neurocomputing 68 (1-4): 38-53. ScholarBank@NUS Repository. https://doi.org/10.1016/j.neucom.2005.02.002
Abstract: Many constructive learning algorithms have been proposed to find an appropriate network structure for a classification problem automatically. These algorithms have drawbacks, especially on complex tasks, and modular approaches have been devised to address them. At the same time, parallel training of neural networks with fixed configurations has been proposed to accelerate the training process. This paper presents output partitioning, a new approach that combines the advantages of constructive learning and parallelism. Classification error guides the proposed incremental-partitioning algorithm, which divides the original data set into several smaller sub-data sets with distinct classes. Each sub-data set is then handled in parallel by a smaller, constructively trained sub-network that takes the whole input vector and produces a portion of the final output vector, in which each class is represented by one unit. Three classification data sets are used to test the validity of this method, and the results show that it reduces the classification test error. © 2005 Published by Elsevier B.V.
Source Title: Neurocomputing
URI: http://scholarbank.nus.edu.sg/handle/10635/56977
ISSN: 0925-2312
DOI: 10.1016/j.neucom.2005.02.002
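The abstract describes the method only in prose, so the following is a minimal Python sketch of the output-partitioning idea under stated assumptions. It is not the authors' implementation: the error-guided round-robin partitioning heuristic, the fixed-architecture scikit-learn MLPs (standing in for the paper's constructively trained sub-networks), the concatenate-and-argmax decision rule, and all names (partition_classes, OutputPartitionedClassifier) are illustrative assumptions.

```python
# Sketch of output partitioning: classes are split across sub-networks, each
# sub-network sees the whole input vector but produces only its slice of the
# final output vector. Illustrative only; not the paper's algorithm.
import numpy as np
from sklearn.neural_network import MLPClassifier

def partition_classes(X, y, n_partitions):
    """Assign each class label to one of n_partitions sub-data sets.

    Hypothetical heuristic: a quick probe network estimates per-class
    classification error, and classes are dealt round-robin from hardest
    to easiest so difficulty is spread across partitions.
    """
    probe = MLPClassifier(hidden_layer_sizes=(10,), max_iter=300).fit(X, y)
    pred = probe.predict(X)
    classes = np.unique(y)
    errors = [np.mean(pred[y == c] != c) for c in classes]
    ranked = classes[np.argsort(errors)[::-1]]  # hardest classes first
    return [list(ranked[i::n_partitions]) for i in range(n_partitions)]

class OutputPartitionedClassifier:
    """One sub-network per partition; each is trained only on the samples
    whose class falls in its partition (trainable in parallel)."""

    def __init__(self, partitions):
        self.partitions = partitions
        self.subnets = []

    def fit(self, X, y):
        for classes in self.partitions:
            mask = np.isin(y, classes)
            net = MLPClassifier(hidden_layer_sizes=(20,), max_iter=500)
            net.fit(X[mask], y[mask])  # uses the whole input vector
            self.subnets.append(net)
        return self

    def predict(self, X):
        # Concatenate each sub-network's per-class scores into the full
        # output vector, then pick the best-scoring class overall.
        all_classes, all_scores = [], []
        for net in self.subnets:
            all_classes.extend(net.classes_)
            all_scores.append(net.predict_proba(X))
        scores = np.hstack(all_scores)
        return np.asarray(all_classes)[np.argmax(scores, axis=1)]

if __name__ == "__main__":
    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split

    X, y = load_digits(return_X_y=True)
    Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
    parts = partition_classes(Xtr, ytr, n_partitions=3)
    model = OutputPartitionedClassifier(parts).fit(Xtr, ytr)
    print("test accuracy:", np.mean(model.predict(Xte) == yte))
```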
Appears in Collections: Staff Publications