Please use this identifier to cite or link to this item: https://doi.org/10.1109/ICCIS.2006.252267
Title: MultiLearner based recursive supervised training
Authors: Ramanathan, K.
Guan, S.U. 
Iyer, L.R.
Keywords: Backpropagation
Neural networks
Probabilistic neural networks (PNN)
Recursive percentage based hybrid pattern training (RPHP)
Supervised learning
Issue Date: 2006
Citation: Ramanathan, K., Guan, S.U., Iyer, L.R. (2006). MultiLearner based recursive supervised training. 2006 IEEE Conference on Cybernetics and Intelligent Systems. ScholarBank@NUS Repository. https://doi.org/10.1109/ICCIS.2006.252267
Abstract: In supervised learning, most single-solution neural networks such as Constructive Backpropagation give good results with some datasets but not with others. Others, such as Probabilistic Neural Networks (PNN), fit a curve to perfection but need to be manually tuned in the case of noisy data. Recursive Percentage based Hybrid Pattern Training (RPHP) overcomes this problem by recursively training subsets of the data, thereby using several neural networks. MultiLearner based Recursive Training (MLRT) is an extension of this approach, in which a combination of existing and new learners is used and each subset is trained with the weak learner best suited to that subset. We observed empirically that MLRT performs considerably better than RPHP and other systems on benchmark data, with an 11% improvement in accuracy on the spam dataset and comparable performance on the vowel and two-spiral problems. ©2006 IEEE.
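
The abstract describes MLRT only at a high level. The following is a minimal, hypothetical Python sketch of that idea: recursively train on the patterns the current stage misclassifies, and at each recursion level pick whichever candidate weak learner fits that subset best. The candidate learners, selection criterion (training accuracy), and stopping rules below are illustrative stand-ins, not the authors' implementation; in particular, the paper's RPHP/MLRT system combines the trained subnetworks through a pattern distributor, which is not reproduced here.

    import numpy as np
    from sklearn.base import clone
    from sklearn.neural_network import MLPClassifier
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.tree import DecisionTreeClassifier

    def candidate_learners():
        # Pool of weak learners available at each recursion level (illustrative choices).
        return [
            MLPClassifier(hidden_layer_sizes=(10,), max_iter=500),  # backprop-style network
            KNeighborsClassifier(n_neighbors=3),
            DecisionTreeClassifier(max_depth=4),
        ]

    def mlrt_train(X, y, max_levels=5, min_patterns=10):
        """Return a list of trained stages; each stage is fitted on the
        patterns that the previous stage still misclassified."""
        stages = []
        X_rem, y_rem = X, y
        for _ in range(max_levels):
            if len(y_rem) < min_patterns or len(np.unique(y_rem)) < 2:
                break
            # Choose the candidate that fits the current subset best.
            best, best_acc = None, -1.0
            for proto in candidate_learners():
                clf = clone(proto).fit(X_rem, y_rem)
                acc = clf.score(X_rem, y_rem)
                if acc > best_acc:
                    best, best_acc = clf, acc
            stages.append(best)
            # Recurse on the patterns this stage gets wrong.
            wrong = best.predict(X_rem) != y_rem
            if not wrong.any():
                break
            X_rem, y_rem = X_rem[wrong], y_rem[wrong]
        return stages

At prediction time, the actual RPHP/MLRT approach routes each test pattern to the appropriate subnetwork via a learned pattern distributor; the sketch above covers only the recursive training loop.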
Source Title: 2006 IEEE Conference on Cybernetics and Intelligent Systems
URI: http://scholarbank.nus.edu.sg/handle/10635/71044
ISBN: 1424400236
DOI: 10.1109/ICCIS.2006.252267
Appears in Collections:Staff Publications
