Title: FastProp: A selective training algorithm for fast error propagation
Authors: Wong, F.S. 
Issue Date: 1991
Citation: Wong, F.S. (1991). FastProp: A selective training algorithm for fast error propagation : 2038-2043. ScholarBank@NUS Repository.
Abstract: An improved backpropagation algorithm, called FastProp, for training a feedforward neural network is described. The unique feature of the algorithm is selective training, which is based on the instantaneous causal relationship between the input and output signals during the training process. This causal relationship is calculated from the error backpropagated to the input layer. The accumulated errors, referred to as accumulated error indices (AEIs), are used to rank the input signals according to their correlation with the output signals. An entire set of time series data can be clustered into several situations based on the current input signal with the highest AEI, and the neurons can be activated according to the current situation. Experimental results showed that the selective training algorithm achieves a significant reduction in training time compared to the traditional backpropagation algorithm.
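The AEI idea described in the abstract can be sketched as follows: during training, the error backpropagated to the input layer is accumulated per input, and the inputs are then ranked by their accumulated error. This is a minimal illustrative sketch, not the paper's implementation; the tiny fixed-weight network, the variable names, and the sigmoid activations are all assumptions made for the example.

```python
import numpy as np

# Hypothetical sketch of accumulated error indices (AEIs).
# The toy network and all names here are illustrative assumptions,
# not taken from the FastProp paper itself.

rng = np.random.default_rng(0)
n_in, n_hid = 4, 3
W1 = rng.normal(size=(n_in, n_hid))   # input -> hidden weights
W2 = rng.normal(size=(n_hid, 1))      # hidden -> output weights

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

aei = np.zeros(n_in)                  # one accumulated error index per input
for _ in range(100):                  # toy loop (weights held fixed for brevity)
    x = rng.normal(size=n_in)
    target = np.array([1.0])
    h = sigmoid(x @ W1)               # forward pass
    y = sigmoid(h @ W2)
    # Backpropagate the output error down to the input layer.
    delta_out = (y - target) * y * (1 - y)
    delta_hid = (delta_out @ W2.T) * h * (1 - h)
    err_at_input = delta_hid @ W1.T
    aei += np.abs(err_at_input)       # accumulate per-input error magnitude

ranking = np.argsort(-aei)            # inputs ranked by AEI, highest first
print(ranking)
```

In this reading, the input with the highest AEI would be used to assign the current pattern to a situation cluster, so that only the neurons associated with that situation are activated.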
ISBN: 0780302273
Appears in Collections:Staff Publications
