Please use this identifier to cite or link to this item:
|Title:||Learning to integrate web taxonomies|
|Authors:||Zhang, D.; Lee, W.S.|
|Citation:||Zhang, D., Lee, W.S. (2004). Learning to integrate web taxonomies. Web Semantics 2 (2) : 131-151. ScholarBank@NUS Repository. https://doi.org/10.1016/j.websem.2004.10.001|
|Abstract:||We investigate machine learning methods for automatically integrating objects from different taxonomies into a master taxonomy. This problem is not only currently pervasive on the Web, but is also important to the emerging Semantic Web. A straightforward approach to automating this process would be to build classifiers through machine learning and then use these classifiers to classify objects from the source taxonomies into categories of the master taxonomy. However, conventional machine learning algorithms totally ignore the availability of the source taxonomies. In fact, source and master taxonomies often have common categories under different names or other more complex semantic overlaps. We introduce two techniques that exploit the semantic overlap between the source and master taxonomies to build better classifiers for the master taxonomy. The first technique, Cluster Shrinkage, biases the learning algorithm against splitting source categories by making objects in the same category appear more similar to each other. The second technique, Co-Bootstrapping, tries to facilitate the exploitation of inter-taxonomy relationships by providing category indicator functions as additional features for the objects. Our experiments with real-world Web data show that these proposed add-on techniques can enhance various machine learning algorithms to achieve substantial improvements in performance for taxonomy integration. © 2004 Elsevier B.V. All rights reserved.|
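The two add-on techniques named in the abstract can be sketched roughly as follows. This is a toy Python illustration: the function names, the dict-based bag-of-words vectors, and the shrinkage weight `lam` are illustrative assumptions, not the paper's actual formulation.

```python
# Toy sketch of the two add-on techniques described in the abstract.
# Feature vectors are plain dicts of term weights; all names and the
# shrinkage weight `lam` are illustrative, not taken from the paper.

def augment_with_indicators(features, source_categories, prefix):
    """Co-Bootstrapping (sketch): expose an object's categories in the
    other taxonomy as extra binary indicator features, so a classifier
    for the master taxonomy can exploit inter-taxonomy overlap."""
    augmented = dict(features)
    for cat in source_categories:
        augmented[f"{prefix}:{cat}"] = 1
    return augmented

def shrink_toward_centroid(vectors, lam=0.5):
    """Cluster Shrinkage (sketch): pull every vector in one source
    category toward that category's centroid, making same-category
    objects look more similar and biasing the learner against
    splitting the source category."""
    keys = set().union(*(v.keys() for v in vectors))
    n = len(vectors)
    centroid = {k: sum(v.get(k, 0.0) for v in vectors) / n for k in keys}
    return [{k: (1 - lam) * v.get(k, 0.0) + lam * centroid[k] for k in keys}
            for v in vectors]
```

For example, `augment_with_indicators({"word:jazz": 2}, {"Music"}, "SRC")` adds the indicator feature `"SRC:Music": 1` alongside the original term features, and `shrink_toward_centroid` blends each vector with its category centroid by the weight `lam`.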
|Source Title:||Web Semantics|
|Appears in Collections:||Staff Publications|
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.