dc.title: Flexibility and accuracy enhancement techniques for neural networks
dc.contributor.author: LI PENG (HT006960U)
dc.identifier.citation: LI PENG (HT006960U) (2004-06-23). Flexibility and accuracy enhancement techniques for neural networks. ScholarBank@NUS Repository.
dc.description.abstract: This thesis focuses on techniques that improve the flexibility and accuracy of the Multilayer Perceptron (MLP) neural network. It covers three topics. In the first topic, I propose three Incremental Output Learning (IOL) algorithms for incremental output learning. In the second topic, I propose a Hierarchical Incremental Class Learning (HICL) task decomposition method based on the IOL algorithms. In this method, a K-class problem is divided into K sub-problems. Unlike other task decomposition methods, HICL can also maintain the useful correlation within the output attributes of a problem. In the last topic, I propose two feature selection techniques, Relative Importance Factor (RIF) and Relative FLD Weight Analysis (RFWA), for neural networks with class decomposition. These approaches involve the use of Fisher's linear discriminant (FLD) function to obtain the importance of each feature and to find correlations among features.
dc.subject: Neural Network; Optimization; Flexibility; Accuracy; Algorithm
dc.contributor.department: ELECTRICAL & COMPUTER ENGINEERING
dc.contributor.supervisor: GUAN SHENG-UEI, STEVEN
dc.description.degreeconferred: MASTER OF ENGINEERING
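The abstract mentions using Fisher's linear discriminant (FLD) to rank feature importance. The thesis's exact RIF/RFWA formulations are not given here, so the following is only a minimal sketch of the standard per-feature Fisher criterion (between-class scatter divided by within-class scatter) normalized into relative importance factors; the function name and the normalization step are illustrative assumptions.

```python
import numpy as np

def fld_feature_importance(X, y):
    """Per-feature Fisher discriminant ratio for a two-class problem.

    Hypothetical sketch: computes (between-class scatter) /
    (within-class scatter) for each feature independently, then
    normalizes the scores so they sum to 1. The thesis's actual
    RIF/RFWA definitions may differ.
    """
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    X0, X1 = X[y == 0], X[y == 1]
    between = (X0.mean(axis=0) - X1.mean(axis=0)) ** 2  # class-mean separation
    within = X0.var(axis=0) + X1.var(axis=0)            # pooled spread
    scores = between / (within + 1e-12)                 # avoid division by zero
    return scores / scores.sum()                        # relative importance

# Toy example: feature 0 separates the two classes, feature 1 is noise
X = [[0.0, 5.0], [0.1, 4.0], [1.0, 5.0], [1.1, 4.0]]
y = [0, 0, 1, 1]
rif = fld_feature_importance(X, y)
```

On this toy data the discriminative feature receives nearly all of the importance mass, which is the behavior a feature-selection step would exploit before training the per-class sub-networks.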
Appears in Collections: Master's Theses (Open)

Files in This Item:
LiPeng.pdf (631.72 kB, Adobe PDF)
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.