Please use this identifier to cite or link to this item:
https://doi.org/10.1057/palgrave.jors.2601807
Title: Automatic knowledge extraction from survey data: Learning M-of-N constructs using a hybrid approach
Authors: Setiono, R.; Pan, S.-L.; Hsieh, M.-H.; Azcarraga, A.
Keywords: Decision trees; M-of-N constructs; Neural networks
Issue Date: 2005
Citation: Setiono, R., Pan, S.-L., Hsieh, M.-H., & Azcarraga, A. (2005). Automatic knowledge extraction from survey data: Learning M-of-N constructs using a hybrid approach. Journal of the Operational Research Society, 56(1), 3-14. ScholarBank@NUS Repository. https://doi.org/10.1057/palgrave.jors.2601807
Abstract: Data collected from a survey typically consist of attributes that are mostly, if not completely, binary-valued or binary-encoded. We present a method for handling such data where the underlying data analysis can be cast as a classification problem. We propose a hybrid method that combines neural network and decision tree methods: the network is trained to remove irrelevant data attributes, and the decision tree is then applied to extract comprehensible classification rules from the trained network. The conditions of the rules take the form of a conjunction of M-of-N constructs. An M-of-N construct is a rule condition that is satisfied if (at least, exactly, at most) M of the N binary attributes in the construct are present. The effectiveness of the method is illustrated on data collected for a study of global car market segmentation. The results show that, besides achieving high predictive accuracy, the method also allows meaningful interpretation of the relationships among the data variables.
Source Title: Journal of the Operational Research Society
URI: http://scholarbank.nus.edu.sg/handle/10635/42882
ISSN: 0160-5682
DOI: 10.1057/palgrave.jors.2601807
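The M-of-N construct described in the abstract can be sketched in a few lines of code. This is an illustrative example only, not the authors' implementation: the attribute names, thresholds, and the example rule below are hypothetical, chosen to show how a conjunction of M-of-N conditions would be evaluated on a binary-encoded survey record.

```python
def m_of_n(attributes, m, mode="at_least"):
    """Evaluate an M-of-N construct on a list of 0/1 attribute values.

    The construct is satisfied if (at least / exactly / at most) m of
    the given binary attributes are present, as defined in the abstract.
    """
    count = sum(attributes)  # number of attributes that are present (== 1)
    if mode == "at_least":
        return count >= m
    if mode == "exactly":
        return count == m
    if mode == "at_most":
        return count <= m
    raise ValueError(f"unknown mode: {mode}")


def rule_satisfied(record, constructs):
    """A rule condition is a conjunction of M-of-N constructs.

    `constructs` is a list of (attribute_names, m, mode) tuples; the rule
    fires only if every construct in the conjunction is satisfied.
    """
    return all(
        m_of_n([record[name] for name in names], m, mode)
        for names, m, mode in constructs
    )


# Hypothetical binary-encoded survey record and rule:
record = {"q1": 1, "q2": 0, "q3": 1, "q4": 1, "q5": 0}
rule = [
    (["q1", "q2", "q3"], 2, "at_least"),  # at least 2 of {q1, q2, q3}
    (["q4", "q5"], 1, "exactly"),         # exactly 1 of {q4, q5}
]
print(rule_satisfied(record, rule))  # → True
```

In the paper's setting, the attributes entering each construct are the survey variables that survive the neural network's pruning step, and the constructs themselves are produced by the decision tree extraction.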
Appears in Collections: | Staff Publications |