Title: A partially supervised cross-collection topic model for cross-domain text classification

Citation: Bao, Y., Collier, N., Datta, A. (2013). A partially supervised cross-collection topic model for cross-domain text classification. International Conference on Information and Knowledge Management, Proceedings: 239-248. ScholarBank@NUS Repository. https://doi.org/10.1145/2505515.2505556

Abstract: Cross-domain text classification aims to automatically train a precise text classifier for a target domain using labeled text data from a related source domain. To this end, one of the most promising ideas is to induce a new feature representation in which the distributional difference between domains is reduced, so that a more accurate classifier can be learned in this new feature space. However, most existing methods do not exploit the duality between the marginal distribution of examples and the conditional distribution of class labels given the labeled training examples in the source domain. Moreover, few previous works attempt to explicitly distinguish domain-independent from domain-specific latent features and to align the domain-specific features to further improve cross-domain learning. In this paper, we propose a model called the Partially Supervised Cross-Collection LDA topic model (PSCCLDA) for cross-domain learning, which addresses these two issues in a unified way. Experimental results on nine datasets show that our model outperforms two standard classifiers and four state-of-the-art methods, demonstrating the effectiveness of our proposed model. Copyright is held by the owner/author(s).

Source Title: International Conference on Information and Knowledge Management, Proceedings

Appears in Collections: Staff Publications
Files in This Item:
There are no files associated with this item.