Title: Structure analysis of neural networks
Keywords: geometrical interpretation, neural networks, multilayer perceptron, radial basis function network, structure analysis, over-fitting
Issue Date: 26-Oct-2004
Citation: DING SHENQIANG (2004-10-26). Structure analysis of neural networks. ScholarBank@NUS Repository.
Abstract: This thesis proposes a geometrical interpretation of the multilayer perceptron (MLP). Based on this interpretation, general guidelines are given for selecting the architecture of the MLP, i.e., the numbers of hidden neurons and hidden layers, and the controversial question of whether a four-layered MLP is superior to a three-layered one is carefully examined. The architecture-selection guideline is then applied to the over-fitting problem, and various approaches to over-fitting are reviewed in light of this new geometrical interpretation. Finally, attention shifts to the radial basis function network (RBFN), which can be approximated, or even represented exactly, by certain classes of MLPs given a specified additional input. Various methods of determining the centers and other parameters of an RBFN are compared with their implementations through MLPs.
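The abstract's claim that an RBFN can be represented exactly by an MLP with a specified additional input can be illustrated with a small numerical sketch (this example is an illustration of the general idea, not code from the thesis): a Gaussian RBF unit exp(-||x - c||^2 / s^2) expands to exp(-(||x||^2 - 2 c.x + ||c||^2) / s^2), whose exponent is affine in the augmented input (x, ||x||^2), so a perceptron-style unit with exponential activation on that augmented input reproduces it term for term.

```python
import numpy as np

# Illustration only: a Gaussian RBF unit equals an exponential-activation
# perceptron unit acting on the input augmented with ||x||^2, because
#   exp(-||x - c||^2 / s^2) = exp(-(||x||^2 - 2 c.x + ||c||^2) / s^2),
# i.e. the exponent is affine in (x, ||x||^2).

rng = np.random.default_rng(0)
x = rng.normal(size=3)    # input vector
c = rng.normal(size=3)    # RBF centre (hypothetical values)
s2 = 1.5                  # width parameter sigma^2

# Radial basis function unit
rbf = np.exp(-np.sum((x - c) ** 2) / s2)

# Equivalent perceptron-style unit on the augmented input (x, ||x||^2)
w = 2.0 * c / s2          # weights on x
w_norm = -1.0 / s2        # weight on the additional input ||x||^2
b = -np.dot(c, c) / s2    # bias
mlp_unit = np.exp(np.dot(w, x) + w_norm * np.sum(x ** 2) + b)

assert np.isclose(rbf, mlp_unit)
```

The same identity holds unit by unit, so an entire Gaussian RBFN layer maps onto an MLP hidden layer once the extra coordinate ||x||^2 is supplied as an additional input.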
Appears in Collections:Master's Theses (Open)

Files in This Item:
thesis.pdf (2.65 MB, Adobe PDF)



