|Title:||Learning by propagability|
|Source:||Ni, B., Yan, S., Kassim, A., Cheong, L.F. (2008). Learning by propagability. Proceedings - IEEE International Conference on Data Mining, ICDM : 492-501. ScholarBank@NUS Repository. https://doi.org/10.1109/ICDM.2008.53|
|Abstract:||In this paper, we present a novel feature extraction framework, called learning by propagability. The whole learning process is driven by the philosophy that the data labels and the optimal feature representation can constitute a harmonic system; namely, the data labels are invariant with respect to propagation on the similarity graph constructed from the optimal feature representation. Based on this philosophy, a unified formulation for learning by propagability is proposed for both supervised and semi-supervised configurations. Specifically, this formulation offers two characteristics for semi-supervised learning: 1) unlike conventional semi-supervised learning algorithms, which mostly involve at least two parameters, this formulation is parameter-free; and 2) the formulation unifies label propagation and the pursuit of the optimal representation, so label propagation benefits from a graph constructed with the derived optimal representation instead of the original representation. Extensive experiments on UCI toy data, handwritten digit recognition, and face recognition all validate the effectiveness of our proposed learning framework compared with state-of-the-art methods for feature extraction and semi-supervised learning. © 2008 IEEE.|
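The abstract's core mechanism, propagating labels over a similarity graph until label assignments are consistent with graph structure, can be illustrated with a minimal sketch. The code below is not the paper's unified formulation (which jointly learns the representation); it is a standard label-propagation baseline on a Gaussian similarity graph over fixed features, with hypothetical parameter choices (`sigma`, `alpha`, `n_iter`) for illustration only.

```python
import numpy as np

def propagate_labels(X, y, n_iter=50, sigma=1.0, alpha=0.9):
    """Label propagation on a Gaussian similarity graph (illustrative sketch).

    X : (n, d) feature matrix.
    y : length-n integer labels, with -1 marking unlabeled points.
    Returns a predicted label for every point.
    """
    n = X.shape[0]
    # Pairwise similarities W_ij = exp(-||x_i - x_j||^2 / (2 sigma^2))
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    # Row-normalize to obtain the propagation matrix S
    S = W / W.sum(axis=1, keepdims=True)

    classes = np.unique(y[y >= 0])
    # One-hot seed matrix for the labeled points
    Y0 = np.zeros((n, classes.size))
    for k, c in enumerate(classes):
        Y0[y == c, k] = 1.0

    # Iterate: propagate along the graph, then pull back toward the seeds
    F = Y0.copy()
    for _ in range(n_iter):
        F = alpha * (S @ F) + (1 - alpha) * Y0
    return classes[F.argmax(axis=1)]
```

In the paper's framework the graph itself is built from a learned representation rather than the raw features `X`, which is what distinguishes it from this fixed-graph baseline.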
|Source Title:||Proceedings - IEEE International Conference on Data Mining, ICDM|
|Appears in Collections:||Staff Publications|