|Title:||Robust Non-negative Graph Embedding: Towards noisy data, unreliable graphs, and noisy labels|
|Citation:||Zhang, H., Zha, Z.-J., Yan, S., Wang, M., Chua, T.-S. (2012). Robust Non-negative Graph Embedding: Towards noisy data, unreliable graphs, and noisy labels. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition : 2464-2471. ScholarBank@NUS Repository. https://doi.org/10.1109/CVPR.2012.6247961|
|Abstract:||Non-negative data factorization has been widely used recently. However, existing techniques, such as Non-negative Graph Embedding (NGE), often suffer from noisy data, unreliable graphs, and noisy labels, which are commonly encountered in real-world applications. To address these issues, in this paper, we propose a Robust Non-negative Graph Embedding (RNGE) framework. The joint sparsity in both graph embedding and reconstruction endows RNGE with robustness. We develop an elegant multiplicative updating solution that can solve RNGE efficiently, and we rigorously prove its convergence. RNGE is robust to unreliable graphs, as well as to both sample and label noise in training data. Moreover, RNGE provides a general formulation such that all the algorithms unified under the graph embedding framework can be easily extended to obtain their robust non-negative solutions. We conduct extensive experiments on four real-world datasets and compare the proposed RNGE to NGE and other representative non-negative data factorization and subspace learning methods. The experimental results demonstrate the effectiveness and robustness of RNGE. © 2012 IEEE.|
|Source Title:||Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition|
|Appears in Collections:||Staff Publications|
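The abstract refers to a "multiplicative updating solution" for the factorization. As background only, the following is a minimal sketch of the classic Lee–Seung multiplicative updates for plain non-negative matrix factorization (V ≈ WH) in numpy; it is not the RNGE algorithm from the paper, and the function name and parameters are illustrative assumptions.

```python
import numpy as np

def nmf_multiplicative(V, rank, n_iter=200, eps=1e-10, seed=0):
    """Classic Lee-Seung multiplicative updates for V ~= W @ H.

    Background sketch only: RNGE uses a more elaborate objective
    (graph embedding terms plus joint sparsity), but its solver is
    in this same multiplicative-update family.
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    # Random non-negative initialization; eps keeps entries strictly positive.
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(n_iter):
        # H <- H * (W^T V) / (W^T W H): keeps H non-negative element-wise.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        # W <- W * (V H^T) / (W H H^T): keeps W non-negative element-wise.
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy usage: factor a random non-negative matrix.
V = np.abs(np.random.default_rng(1).random((20, 15)))
W, H = nmf_multiplicative(V, rank=5)
reconstruction_error = np.linalg.norm(V - W @ H)
```

Because each update is a non-negative rescaling, non-negativity of W and H is preserved automatically, which is the property the abstract's "multiplicative updating" phrasing points to.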
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.