Please use this identifier to cite or link to this item: http://scholarbank.nus.edu.sg/handle/10635/13836
Title: Flexibility and accuracy enhancement techniques for neural networks
Authors: LI PENG (HT006960U)
Keywords: Neural Network Optimization Flexibility Accuracy Algorithm
Issue Date: 23-Jun-2004
Source: LI PENG (HT006960U) (2004-06-23). Flexibility and accuracy enhancement techniques for neural networks. ScholarBank@NUS Repository.
Abstract: This thesis focuses on techniques that improve the flexibility and accuracy of Multilayer Perceptron (MLP) neural networks. It covers three topics. In the first topic, I propose three Incremental Output Learning (IOL) algorithms for incremental output learning. In the second topic, I propose a hierarchical incremental class learning (HICL) task decomposition method based on the IOL algorithms, in which a multi-class problem is divided into sub-problems. Unlike other task decomposition methods, HICL also preserves the useful correlation among the output attributes of a problem. In the last topic, I propose two feature selection techniques, Relative Importance Factor (RIF) and Relative FLD Weight Analysis (RFWA), for neural networks with class decomposition. These approaches use Fisher's linear discriminant (FLD) function to obtain the importance of each feature and to find correlations among features.
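The abstract does not give the exact formulas for RIF or RFWA, but both build on Fisher's linear discriminant. The sketch below is a minimal, hypothetical illustration of the underlying idea: compute the FLD direction for a two-class problem and read the normalized magnitude of each weight as a relative importance score for that feature. The function name and the normalization are assumptions, not the thesis's method.

```python
import numpy as np

def fld_feature_importance(X, y):
    """Rank features by the magnitude of their Fisher's linear
    discriminant (FLD) weights for a two-class problem.

    Hypothetical sketch only: the thesis's RIF/RFWA techniques are
    based on FLD, but their exact definitions are not in the abstract.
    """
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Within-class scatter matrix S_w = S_0 + S_1
    # (np.cov divides by n-1, so multiply back to get scatter)
    Sw = np.cov(X0, rowvar=False) * (len(X0) - 1) \
       + np.cov(X1, rowvar=False) * (len(X1) - 1)
    # FLD direction: w = S_w^{-1} (m1 - m0)
    w = np.linalg.solve(Sw, m1 - m0)
    # Relative importance: absolute weight, normalized to sum to 1
    imp = np.abs(w) / np.abs(w).sum()
    return imp

rng = np.random.default_rng(0)
# Synthetic data: feature 0 separates the classes, feature 1 is noise
X = np.vstack([rng.normal(0, 1, (50, 2)),
               rng.normal([3, 0], 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
imp = fld_feature_importance(X, y)
```

On such data the discriminative feature receives the larger importance score, which is the behaviour a feature selection method of this kind relies on.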
Appears in Collections:Master's Theses (Open)

Files in This Item:
File: LiPeng.pdf
Size: 631.72 kB
Format: Adobe PDF
Access Settings: OPEN
Version: None

Page view(s): 145 (checked on Dec 11, 2017)
Download(s): 135 (checked on Dec 11, 2017)



Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.