Title: Incremental self-growing neural networks with the changing environment
Keywords: Cascade correlation networks
Citation: Su, L., Guan, S.U., Yeo, Y.C. (2001). Incremental self-growing neural networks with the changing environment. Journal of Intelligent Systems 11(1): 43-74. ScholarBank@NUS Repository.
Abstract: Conventional incremental learning approaches in multi-layered feedforward neural networks are driven by new incoming training instances. In this paper, by contrast, a changing environment is defined as the arrival of new input features for a given problem. Our empirical study illustrates that ISGNN (incremental self-growing neural networks) can adapt to such a changing environment with a new input dimension. Meanwhile, dynamic neural network algorithms are used to design the network structure automatically, avoiding a time-consuming trial-and-error search for an appropriate topology. We also exploit the information learned by the previously grown network to avoid retraining. Finally, we report simulation results on two benchmark problems. Our experiments show that this kind of adaptive learning mechanism can significantly improve the performance of the original networks.
Source Title: Journal of Intelligent Systems
Appears in Collections: Staff Publications
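The abstract's central idea — growing a network's input dimension while preserving already-learned weights, so the previously trained network need not be retrained from scratch — can be sketched as below. This is a minimal hypothetical illustration, not the paper's ISGNN algorithm: the class name `GrowableLayer`, the weight-initialization scale, and the tanh activation are all assumptions made for demonstration.

```python
import math
import random

random.seed(0)

class GrowableLayer:
    """A single linear layer whose input dimension can grow.

    Hypothetical sketch: when the environment starts supplying new
    input features, existing weights are kept intact and only fresh
    small random weights are appended for the new inputs, so earlier
    learning is preserved rather than discarded.
    """

    def __init__(self, n_in, n_out):
        # W[j][i] connects input feature i to output unit j.
        self.W = [[random.gauss(0.0, 0.1) for _ in range(n_in)]
                  for _ in range(n_out)]

    def forward(self, x):
        # Standard tanh-activated linear map.
        return [math.tanh(sum(w * xi for w, xi in zip(row, x)))
                for row in self.W]

    def grow_inputs(self, n_new):
        # Extend each output unit's weight row with entries for the
        # new features; existing entries are left untouched.
        for row in self.W:
            row.extend(random.gauss(0.0, 0.1) for _ in range(n_new))

layer = GrowableLayer(n_in=4, n_out=3)
old_first_row = list(layer.W[0])

layer.grow_inputs(2)   # the environment now supplies 6 features
print(len(layer.W[0])) # each output unit now has 6 input weights
```

After `grow_inputs`, the first four weights of every unit are unchanged, which is the sense in which information learned by the previously grown network is reused instead of retrained.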