Title: An effective method for generating multiple linear regression rules from artificial neural networks
Authors: Setiono, R.; Azcarraga, A.
Source: Setiono, R., Azcarraga, A. (2001). An effective method for generating multiple linear regression rules from artificial neural networks. Proceedings of the International Conference on Tools with Artificial Intelligence: 171-178. ScholarBank@NUS Repository.
Abstract: We describe a method for multivariate function approximation which combines neural network learning, clustering and multiple regression. Neural networks with a single hidden layer are universal function approximators. However, due to the complexity of the network topology and the nonlinear transfer function used in computing the activation of the hidden units, the predictions of a trained network are difficult to comprehend. On the other hand, predictions from a multiple linear regression equation are easy to understand but not accurate when the underlying relationship between the input variables and the output variable is nonlinear. The method presented in this paper generates a set of multiple linear regression equations using neural networks. The number of regression equations is determined by clustering the weighted input variables. The predictions for samples in the same cluster are computed by the same regression equation. Experimental results on real-world data demonstrate that the new method generates relatively few regression equations from the training data samples. The errors in prediction using these equations are comparable to or lower than those achieved by existing function approximation methods.
Source Title: Proceedings of the International Conference on Tools with Artificial Intelligence
Appears in Collections: Staff Publications
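The pipeline the abstract outlines — train a single-hidden-layer network, cluster the samples by the weighted inputs to the hidden units, then fit one multiple linear regression per cluster — can be sketched as follows. This is a minimal illustration using scikit-learn stand-ins (MLPRegressor, KMeans, LinearRegression), not the paper's own training or clustering algorithms; the synthetic data, the number of clusters, and all variable names are assumptions for the example.

```python
# Illustrative sketch of the abstract's pipeline, NOT the paper's implementation:
# scikit-learn components stand in for the network training and clustering steps.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 3))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.5 * X[:, 2]  # nonlinear target

# 1. Train a neural network with a single hidden layer.
net = MLPRegressor(hidden_layer_sizes=(5,), max_iter=5000, random_state=0)
net.fit(X, y)

# 2. Compute the weighted inputs to the hidden units (X @ W + b)
#    and cluster the samples in that space.
H = X @ net.coefs_[0] + net.intercepts_[0]
N_CLUSTERS = 3  # illustrative choice; the paper derives this from the clustering
km = KMeans(n_clusters=N_CLUSTERS, n_init=10, random_state=0).fit(H)

# 3. Fit one multiple linear regression per cluster; all samples in
#    the same cluster share the same regression equation.
rules = []
for c in range(N_CLUSTERS):
    mask = km.labels_ == c
    rules.append(LinearRegression().fit(X[mask], y[mask]))

# 4. Predict a new sample with the equation of the cluster it falls into.
x_new = rng.uniform(-1, 1, size=(1, 3))
c = km.predict(x_new @ net.coefs_[0] + net.intercepts_[0])[0]
y_hat = rules[c].predict(x_new)[0]
```

Each regression equation here is a readable linear rule over the original input variables, which is the interpretability advantage the abstract claims over reading the trained network's weights directly.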