|Title:||Delta rule and learning for min-max neural networks|
|Source:||Zhang, Xinghu, Hang, Chang-Chieh, Tan, Shaohua, Wang, Pei-Zhuang (1994). Delta rule and learning for min-max neural networks. IEEE International Conference on Neural Networks - Conference Proceedings 1 : 38-43. ScholarBank@NUS Repository.|
|Abstract:||There have been many works discussing the (∨, ∧)-neural network (see references). However, because of the difficulty of the mathematical analysis of (∨, ∧)-functions, most previous works choose bounded plus (+) and multiply (*) as the operations of ∨ and ∧. The (∨, ∧) neural network with operators (+, *) is much easier to handle than the (∨, ∧) neural network with other operators, e.g. min-max operators, because it differs only slightly from the back-propagation neural network. In this paper, we choose min and max as the operations of ∨ and ∧. Because functions involving the min and max operations are difficult to analyze, the (∨, ∧) neural network with operators (min, max) is much harder to deal with. In Section 1 of this paper, we first discuss the differentiation of (∨, ∧)-functions and obtain the result that if f1(x), f2(x), ..., fn(x) are continuously differentiable on the real line R, then any function h(x) generated from f1(x), f2(x), ..., fn(x) through finitely many (∨, ∧) operations is continuously differentiable almost everywhere in R. This statement guarantees that the Delta Rule given in Section 2 is rational and effective. In Section 3 we implement a simple example to show that the Delta Rule given in Section 2 is capable of training (∨, ∧) neural networks.|
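The abstract's almost-everywhere differentiability result is what makes a delta rule workable for min-max networks: at almost every input, min and max each have exactly one "active" argument, and the derivative passes through that branch only. The sketch below is a hypothetical illustration of that idea for a single neuron of the form y = max_j min(w_j, x_j) (the neuron form, function names, and learning rate are assumptions, not taken from the paper):

```python
import numpy as np

def minmax_forward(w, x):
    """Forward pass of one (max, min) neuron: y = max_j min(w_j, x_j).

    Returns the output and the index j of the active max-branch.
    """
    m = np.minimum(w, x)
    j = int(np.argmax(m))  # branch attaining the max at this input
    return m[j], j

def delta_rule_step(w, x, t, lr=0.1):
    """One delta-rule update for target t.

    The gradient flows only through the active max-branch j, and
    d/dw_j min(w_j, x_j) = 1 where w_j < x_j, else 0 -- this derivative
    is defined almost everywhere, matching the paper's Section 1 result.
    """
    y, j = minmax_forward(w, x)
    err = t - y
    if w[j] < x[j]:          # min attained by the weight: derivative is 1
        w[j] += lr * err
    return w, err
```

For example, with w = [0.2, 0.4], x = [1.0, 0.9], and target t = 0.8, the active branch is j = 1 (y = 0.4), and since w[1] < x[1] the update moves w[1] toward the target while w[0] is untouched.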
|Source Title:||IEEE International Conference on Neural Networks - Conference Proceedings|
|Appears in Collections:||Staff Publications|
Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.