Title: Hierarchical incremental class learning with output parallelism
Subject: Modular neural networks
Citation: Guan, S.-U., & Wang, K. (2007). Hierarchical incremental class learning with output parallelism. Journal of Intelligent Systems, 16(2), 167-193. ScholarBank@NUS Repository.
Abstract: The major drawback of a non-modular neural network classifier is its inability to cope with the increasing complexity of classification tasks. A modular neural network (MNN) classifier can eliminate internal interference among hidden layers, but it also ignores useful information between classes. The hierarchical incremental class learning (HICL) scheme recently proposed for MNN classifiers further improves performance by exploiting the information between classes, but HICL still suffers from a certain degree of harmful interference within the network. In this paper, we propose a new structure for modular neural network classifiers, Hierarchical Incremental Class Learning with Output Parallelism (HICL-OP), based on HICL and output parallelism. The proposed HICL-OP not only inherits the advantages of HICL but also reduces the harmful interference faced by HICL. Experimental results on several benchmark problems show that HICL-OP outperforms both HICL and output parallelism, and that it is especially effective for classification problems with multiple output attributes.
Source Title: Journal of Intelligent Systems
Appears in Collections: Staff Publications
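The abstract describes two combined ideas: output parallelism, which splits a K-class task into per-class sub-problems each handled by its own module, and hierarchical incremental learning, where later modules also receive earlier modules' outputs as extra inputs. The sketch below illustrates that data flow only; the module names, interfaces, and the trivial linear stand-in for a trained sub-network are assumptions for illustration, not the paper's actual implementation.

```python
# Minimal sketch of the HICL-OP data flow (assumed interfaces, not the
# authors' code): one module per output class, evaluated in a hierarchy
# where each module sees the original input plus all earlier outputs.
from typing import Callable, List

Module = Callable[[List[float]], float]  # feature vector -> class score

def make_module(weight: float) -> Module:
    # Hypothetical stand-in for a trained sub-network: a fixed linear score.
    return lambda x: weight * sum(x)

def hicl_op_predict(x: List[float], modules: List[Module]) -> int:
    """Evaluate modules in order; each receives the input augmented with
    the outputs of all previously evaluated modules, then the class whose
    module scores highest wins."""
    scores: List[float] = []
    for module in modules:
        augmented = x + scores  # the feature vector grows down the hierarchy
        scores.append(module(augmented))
    return max(range(len(scores)), key=lambda k: scores[k])

# One module per class; later modules benefit from earlier modules' outputs.
modules = [make_module(w) for w in (0.5, 1.0, 2.0)]
print(hicl_op_predict([1.0, 2.0], modules))  # → 2
```

The augmentation step is what distinguishes the hierarchical scheme from plain output parallelism, where each module would see only the original input vector.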