Please use this identifier to cite or link to this item: https://doi.org/10.1016/S0893-6080(97)00151-2
Title: Effective learning in recurrent max-min neural networks
Authors: Teow, L.-N. 
Loe, K.-F. 
Keywords: Backpropagation
DFA extraction
Gradient descent
Grammatical inference
Learning
Max-min functions
Neural network
Recurrent architecture
Issue Date: Apr-1998
Citation: Teow, L.-N., Loe, K.-F. (1998-04). Effective learning in recurrent max-min neural networks. Neural Networks 11 (3) : 535-547. ScholarBank@NUS Repository. https://doi.org/10.1016/S0893-6080(97)00151-2
Abstract: Max and min operations have interesting properties that facilitate the exchange of information between the symbolic and real-valued domains. As such, neural networks that employ max-min activation functions have been a subject of interest in recent years. Since max-min functions are not strictly differentiable, we propose a mathematically sound learning method based on using Fourier convergence analysis of side-derivatives to derive a gradient descent technique for max-min error functions. We then propose a novel recurrent max-min neural network model that is trained to perform grammatical inference as an application example. Comparisons made between this model and recurrent sigmoidal neural networks show that our model not only performs better in terms of learning speed and generalization, but also that its final weight configuration allows a deterministic finite automaton (DFA) to be extracted in a straightforward manner. In essence, we are able to demonstrate that our proposed gradient descent technique does allow max-min neural networks to learn effectively.
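Illustrative note (not part of the item record): the abstract's central idea, using side-derivatives so that gradient descent can be applied to non-differentiable max-min functions, can be sketched in a few lines of code. The minimal Python example below is an assumption-laden illustration only: the single unit y = max_i min(w_i, x_i), the names maxmin_forward and maxmin_grad_w, and the toy squared-error loop are hypothetical and do not reproduce the paper's recurrent architecture or its error function.

```python
import numpy as np

# Sketch of side-derivative gradient descent through a max-min unit:
#   y = max_i min(w_i, x_i)
# Max and min are not strictly differentiable, so the gradient is routed
# entirely to the weight attaining the outer max, and only when that weight
# (rather than the input) is the smaller argument of the min. Ties are
# broken by taking the right-hand side-derivative.

def maxmin_forward(w, x):
    z = np.minimum(w, x)      # element-wise min of weight and input
    k = int(np.argmax(z))     # index attaining the outer max
    return z[k], k

def maxmin_grad_w(w, x, k):
    """Side-derivative of y w.r.t. w: zero everywhere except possibly w[k]."""
    g = np.zeros_like(w)
    if w[k] <= x[k]:          # min picked the weight, so dy/dw[k] = 1
        g[k] = 1.0
    return g

# Toy gradient-descent loop on a squared error E = 0.5 * (y - t)^2,
# with targets generated from a hypothetical "true" weight vector.
rng = np.random.default_rng(0)
w_true = np.array([0.2, 0.9, 0.5, 0.7])
w = rng.uniform(0.0, 1.0, size=4)
lr = 0.1
for step in range(200):
    x = rng.uniform(0.0, 1.0, size=4)
    t = np.max(np.minimum(w_true, x))
    y, k = maxmin_forward(w, x)
    dE_dy = y - t
    w -= lr * dE_dy * maxmin_grad_w(w, x, k)  # step along the side-derivative

print("learned weights:", np.round(w, 3))
```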
Source Title: Neural Networks
URI: http://scholarbank.nus.edu.sg/handle/10635/111170
ISSN: 0893-6080
DOI: 10.1016/S0893-6080(97)00151-2
Appears in Collections: Staff Publications

Files in This Item:
There are no files associated with this item.
