Title: Task decomposition with pattern distributor networks
Keywords: Task Decomposition, Pattern Distributor, Reduced Pattern Training
Issue Date: 22-Dec-2008
Citation: BAO CHUNYU (2008-12-22). Task decomposition with pattern distributor networks. ScholarBank@NUS Repository.
Abstract: Task decomposition methods modularize a single large neural network into several smaller modules. In this thesis, we propose a new task decomposition method, the Pattern Distributor (PD), for multilayered feedforward neural networks. First, we present the architecture of single-layer PD networks and develop a theoretical model to analyze their performance. The model and experimental results show that, compared with ordinary task decomposition methods such as Output Parallelism, PD networks achieve better performance. Building on single-layer PDs, we propose a multi-layer PD structure, which improves network performance further. Because the design of the distributor module is a key issue in designing PD networks, we explore the class combination problem for the distributor module and the relation among the class counts of the non-distributor modules, and present several theorems. Based on these theorems, we develop several greedy combination algorithms for the distributor module, together with two further combination algorithms based on Fisher's Linear Discriminant (FLD) and an evolutionary algorithm. Experimental results show that these combination algorithms effectively improve the classification accuracy of PD networks. The PD method has wide applicability and can readily be transferred to real-world problems.
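The two-stage decision described in the abstract (a distributor module routes each input pattern to a specialized non-distributor module, which then makes the final class decision) can be sketched in a few lines. This is a minimal illustrative sketch, not the thesis's implementation: the `PDNetwork` name, the threshold "classifiers", and the particular grouping of four classes into two modules are all assumptions made for the example.

```python
# Hypothetical sketch of a single-layer Pattern Distributor (PD) network.
# A distributor module first assigns the input to a group (coarse decision);
# the corresponding non-distributor module then picks the class within that
# group (fine decision). All names and rules below are illustrative.

class PDNetwork:
    def __init__(self, distributor, modules):
        # distributor: maps an input pattern to a group index
        # modules: one classifier per group, each mapping a pattern to a class label
        self.distributor = distributor
        self.modules = modules

    def classify(self, pattern):
        group = self.distributor(pattern)    # coarse routing decision
        return self.modules[group](pattern)  # fine decision inside the group

# Toy setup: 4 classes split into 2 groups by the sign of the first feature.
distributor = lambda x: 0 if x[0] < 0 else 1
module0 = lambda x: 0 if x[1] < 0 else 1   # separates classes 0 and 1
module1 = lambda x: 2 if x[1] < 0 else 3   # separates classes 2 and 3

net = PDNetwork(distributor, [module0, module1])
```

In a real PD network each module would be a trained feedforward network, and the thesis's combination algorithms would decide which classes each distributor group contains; the lambdas here merely stand in for those trained modules.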
Appears in Collections: Ph.D. Theses (Open)

Files in This Item: Thesis.pdf (1.61 MB, Adobe PDF)






Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.