Title: General class of neural networks
Authors: Romaniuk, Steve G.
Citation: Romaniuk, Steve G. (1994). General class of neural networks. IEEE International Conference on Neural Networks - Conference Proceedings 3: 1331-1334. ScholarBank@NUS Repository.
Abstract: Deriving minimal network architectures for neural networks has been a central concern for several years. To date, numerous algorithms have been proposed to construct networks automatically. Unfortunately, these algorithms lack a fundamental theoretical analysis of their capabilities, and only empirical evaluations on a few selected benchmark problems exist. Some theoretical results have been provided for small classes of well-known benchmark problems such as parity and encoder functions, but these are of limited value due to their restrictiveness. In this work we describe a general class of 2-layer networks with 2 hidden units capable of representing a large set of problems. The cardinality of this class grows exponentially with the number of inputs N. Furthermore, we outline a simple algorithm that determines whether any given function (problem) is a member of this class. The class considered in this paper includes the benchmark problems parity and symmetry.
Source Title: IEEE International Conference on Neural Networks - Conference Proceedings
Appears in Collections: Staff Publications
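To make the abstract's claim concrete, the sketch below shows how the symmetry benchmark can be represented by a 2-layer network with exactly 2 hidden threshold units. The mirrored power-of-two weight scheme is the classic textbook construction and is an assumption here; the paper's own construction for its network class may differ.

```python
def step(z, threshold=0.5):
    """Hard-threshold (Heaviside-style) activation."""
    return 1 if z > threshold else 0

def symmetry_net(x):
    """Return 1 iff the bit string x reads the same forwards and backwards.

    Hidden-layer weights are +2^j on the left half and -2^(n-1-j) on the
    right half (a middle bit, for odd n, gets weight 0).  Because distinct
    powers of two never cancel, the weighted sum s is 0 exactly when the
    input is symmetric.
    """
    n = len(x)
    k = n // 2
    w = [2 ** j if j < k else (-(2 ** (n - 1 - j)) if j >= n - k else 0)
         for j in range(n)]
    s = sum(wi * xi for wi, xi in zip(w, x))
    h1 = step(s)    # fires when the left half outweighs the right
    h2 = step(-s)   # fires when the right half outweighs the left
    # Output unit fires only when neither hidden unit does (i.e. s == 0).
    return step(1 - 2 * h1 - 2 * h2)
```

Both hidden units share the same input weights up to sign, so only 2 hidden units are needed regardless of N, matching the size of the network class described in the abstract.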