Please use this identifier to cite or link to this item: https://doi.org/10.1109/TNN.2008.2000444
DC Field | Value
dc.title | Hybrid multiobjective evolutionary design for artificial neural networks
dc.contributor.author | Goh, C.-K.
dc.contributor.author | Teoh, E.-J.
dc.contributor.author | Tan, K.C.
dc.date.accessioned | 2014-06-17T02:52:15Z
dc.date.available | 2014-06-17T02:52:15Z
dc.date.issued | 2008
dc.identifier.citation | Goh, C.-K., Teoh, E.-J., Tan, K.C. (2008). Hybrid multiobjective evolutionary design for artificial neural networks. IEEE Transactions on Neural Networks 19 (9) : 1531-1548. ScholarBank@NUS Repository. https://doi.org/10.1109/TNN.2008.2000444
dc.identifier.issn | 10459227
dc.identifier.uri | http://scholarbank.nus.edu.sg/handle/10635/56233
dc.description.abstract | Evolutionary algorithms are a class of stochastic search methods that attempt to emulate the biological process of evolution, incorporating concepts of selection, reproduction, and mutation. In recent years, there has been an increase in the use of evolutionary approaches in the training of artificial neural networks (ANNs). While evolutionary techniques for neural networks have been shown to provide superior performance over conventional training approaches, the simultaneous optimization of network performance and architecture will almost always result in a slow training process due to the added algorithmic complexity. In this paper, we present a geometrical measure based on the singular value decomposition (SVD) to estimate the necessary number of neurons to be used in training a single-hidden-layer feedforward neural network (SLFN). In addition, we develop a new hybrid multiobjective evolutionary approach that includes the features of a variable-length representation that allows for easy adaptation of neural network structures, an architectural recombination procedure based on the geometrical measure that adapts the number of necessary hidden neurons and facilitates the exchange of neuronal information between candidate designs, and a microhybrid genetic algorithm (μHGA) with an adaptive local search intensity scheme for local fine-tuning. Finally, the performances of well-known algorithms as well as the effectiveness and contributions of the proposed approach are analyzed and validated through a variety of data set types. © 2008 IEEE.
dc.description.uri | http://libproxy1.nus.edu.sg/login?url=http://dx.doi.org/10.1109/TNN.2008.2000444
dc.source | Scopus
dc.subject | Artificial neural network (ANN)
dc.subject | Evolutionary algorithms
dc.subject | Local search
dc.subject | Multiobjective optimization
dc.subject | Singular value decomposition (SVD)
dc.type | Article
dc.contributor.department | ELECTRICAL & COMPUTER ENGINEERING
dc.description.doi | 10.1109/TNN.2008.2000444
dc.description.sourcetitle | IEEE Transactions on Neural Networks
dc.description.volume | 19
dc.description.issue | 9
dc.description.page | 1531-1548
dc.description.coden | ITNNE
dc.identifier.isiut | 000259499900003
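The abstract describes an SVD-based geometrical measure for estimating how many hidden neurons an SLFN actually needs. As an illustrative sketch only, not the paper's exact measure: the effective rank of a hidden-layer output matrix can be estimated from its singular value spectrum, treating directions that contribute negligible spectral energy as redundant neurons. The function name and the `energy` threshold below are assumptions for the example.

```python
import numpy as np

def estimate_hidden_neurons(H, energy=0.99):
    """Estimate the number of useful hidden neurons from the singular
    values of a hidden-layer output matrix H (samples x neurons).
    Directions contributing negligibly to H's spectral energy are
    treated as redundant (illustrative sketch, not the paper's measure)."""
    s = np.linalg.svd(H, compute_uv=False)        # singular values, descending
    cumulative = np.cumsum(s**2) / np.sum(s**2)   # fraction of energy captured
    # smallest number of directions capturing the requested energy fraction
    return int(np.searchsorted(cumulative, energy) + 1)

# Deterministic example: two dominant directions and one negligible one,
# so only two hidden neurons are deemed necessary.
H = np.zeros((6, 4))
H[0, 0], H[1, 1], H[2, 2] = 8.0, 4.0, 0.05
print(estimate_hidden_neurons(H))  # → 2
```

In the paper's setting such an estimate would drive the architectural recombination step, pruning or growing the hidden layer toward the estimated size; here it is shown in isolation.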
Appears in Collections: Staff Publications

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.