Please use this identifier to cite or link to this item:
https://scholarbank.nus.edu.sg/handle/10635/44940
Title: Proper and effective training of a pattern recognizer for cyclic data
Authors: Hwarng, H. Brian
Issue Date: 1995
Citation: Hwarng, H. Brian (1995). Proper and effective training of a pattern recognizer for cyclic data. IIE Transactions (Institute of Industrial Engineers) 27 (6): 746-756. ScholarBank@NUS Repository.
Abstract: A new approach to training back-propagation neural networks for identifying cyclic patterns on control charts is presented. The objectives of this research are to show that building an effective cyclic-pattern-recognition neural network requires proper training strategies, and to demonstrate how these strategies, namely incremental and decremental training, should be applied and how performance can be improved with additional statistics. A series of experiments was conducted to study the effect of the number of output pattern classes and the effect of noise on network training and performance. The experiments show that reducing the number of output pattern classes to a small number, e.g., four or fewer, does not guarantee effective learning, and that the noise added to the training data should be kept at a reasonable level to achieve balanced performance. The incorporation of harmonic amplitude statistics (HAS) further demonstrated that the proper use of statistics adopted from Fourier analysis can improve the performance of a cyclic-pattern-recognition neural network. This study offers valuable insights into how to construct and train a back-propagation neural network properly and effectively for detecting cyclic patterns.
Source Title: IIE Transactions (Institute of Industrial Engineers)
URI: http://scholarbank.nus.edu.sg/handle/10635/44940
ISSN: 0740-817X
Appears in Collections: Staff Publications
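The abstract above refers to harmonic amplitude statistics (HAS) adopted from Fourier analysis as supplementary inputs for the cyclic-pattern recognizer. The record does not give the paper's exact HAS definition, so the following is only a minimal sketch of the general idea, assuming the features are the amplitudes of the first few Fourier harmonics of a control-chart data window; the function name and parameters are illustrative, not the paper's.

```python
# Illustrative sketch only: the paper's exact HAS formulation is not given in
# this record. Here the assumed features are FFT harmonic amplitudes of a
# control-chart data window, which could supplement a back-propagation
# network's raw-data inputs.
import numpy as np

def harmonic_amplitude_features(window, n_harmonics=4):
    """Return amplitudes of the first n_harmonics Fourier harmonics of a
    control-chart window (hypothetical HAS-style feature vector)."""
    window = np.asarray(window, dtype=float)
    window = window - window.mean()            # remove the process mean
    spectrum = np.fft.rfft(window)
    amplitudes = 2.0 * np.abs(spectrum) / len(window)
    return amplitudes[1:n_harmonics + 1]        # skip the DC component

# Usage example: a noisy cyclic pattern versus pure noise.
rng = np.random.default_rng(0)
t = np.arange(32)
cyclic = np.sin(2 * np.pi * t / 8) + 0.3 * rng.standard_normal(32)
noise = 0.3 * rng.standard_normal(32)

print(harmonic_amplitude_features(cyclic))  # one harmonic amplitude dominates
print(harmonic_amplitude_features(noise))   # all amplitudes remain small
```

A cyclic window concentrates energy in one harmonic, so such amplitudes separate cyclic from non-cyclic windows more sharply than raw observations, which is consistent with the abstract's claim that Fourier-based statistics can improve recognition performance.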
Files in This Item:
There are no files associated with this item.