Title: Knowledge acquisition and revision using neural networks: An application to a cross-national study of brand image perception
Authors: Setiono, R.; Pan, S.-L.; Hsieh, M.-H.; Azcarraga, A.
Keywords: Global brand image perceptions
Source: Setiono, R., Pan, S.-L., Hsieh, M.-H., & Azcarraga, A. (2006). Knowledge acquisition and revision using neural networks: An application to a cross-national study of brand image perception. Journal of the Operational Research Society, 57(3), 231-240. ScholarBank@NUS Repository. https://doi.org/10.1057/palgrave.jors.2602006
Abstract: A three-tier knowledge management approach is proposed in the context of a cross-national study of car brand and corporate image perceptions. The approach consists of knowledge acquisition, transfer, and revision using neural networks. We investigate how knowledge acquired by a neural network from one car market can be exploited and applied in another market. This transferred knowledge is subsequently revised for application in the new market. Knowledge revision is achieved by re-training the neural network: core knowledge common to both markets is retained, while some localized knowledge components are introduced during network re-training. Since the knowledge acquired by a neural network can be expressed as an accurate set of simple rules, we are able to compare the knowledge extracted from one network with that extracted from another. Comparing the originally acquired knowledge with the revised knowledge provides insights into the commonalities and differences in car brand and corporate perceptions across national markets. © 2006 Operational Research Society Ltd. All rights reserved.
Source Title: Journal of the Operational Research Society
Appears in Collections: Staff Publications
Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.