Please use this identifier to cite or link to this item: https://doi.org/10.1109/ICASSP.2008.4518364
Title: Discriminant simplex analysis
Authors: Fu, Y.
Yan, S. 
Huang, T.S.
Keywords: Discriminant simplex analysis
Graph embedding
k-nearest-neighbor simplex
Subspace learning
Issue Date: 2008
Source: Fu, Y., Yan, S., Huang, T.S. (2008). Discriminant simplex analysis. ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings: 3333-3336. ScholarBank@NUS Repository. https://doi.org/10.1109/ICASSP.2008.4518364
Abstract: Image representation and distance metric are both significant for learning-based visual classification. This paper presents the concept of the k-Nearest-Neighbor Simplex (kNNS), a simplex whose vertices are the k nearest neighbors of a given point. kNNS contributes to image classification in two ways. First, it provides a novel distance metric from a point to its kNNS within a given class for general classification problems. Second, we develop a new subspace learning algorithm, called Discriminant Simplex Analysis (DSA), to pursue an effective feature representation for image classification. In DSA, both the within-locality and the between-locality are modeled by the kNNS distance, which gives a more accurate and robust measure of the probability that a point belongs to a given class. Experiments on real-world image classification demonstrate the effectiveness of both DSA and the kNNS-based classification approach. ©2008 IEEE.
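Below is a minimal sketch of one way the kNNS distance described in the abstract could be computed, assuming the distance from a point to its kNNS is the Euclidean distance to the closest convex combination of its k nearest neighbors within a class (i.e., the nearest point on the simplex). The function name, the SLSQP solver, and the synthetic data are illustrative assumptions, not details taken from the paper.

import numpy as np
from scipy.optimize import minimize

def knns_distance(x, class_points, k=5):
    """Distance from x to the simplex spanned by its k nearest neighbors
    inside one class (a sketch of the kNNS distance; not the authors' code)."""
    # Pick the k nearest neighbors of x within the class.
    dists = np.linalg.norm(class_points - x, axis=1)
    V = class_points[np.argsort(dists)[:k]]        # (k, dim) simplex vertices

    # Minimize ||x - w @ V||^2 subject to w >= 0 and sum(w) = 1,
    # i.e. project x onto the convex hull of the k neighbors.
    def objective(w):
        return np.sum((x - w @ V) ** 2)

    w0 = np.full(k, 1.0 / k)                       # start at the barycenter
    res = minimize(objective, w0, method="SLSQP",
                   bounds=[(0.0, 1.0)] * k,
                   constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}])
    return np.sqrt(res.fun)

if __name__ == "__main__":
    # Toy two-class example: assign x to the class with the smallest kNNS distance.
    rng = np.random.default_rng(0)
    classes = {0: rng.normal(0.0, 1.0, (40, 10)),
               1: rng.normal(3.0, 1.0, (40, 10))}
    x = rng.normal(0.0, 1.0, 10)
    label = min(classes, key=lambda c: knns_distance(x, classes[c], k=5))
    print("predicted class:", label)

Per the abstract, DSA additionally uses this kNNS distance to model both within-locality and between-locality when learning the subspace; the sketch above only illustrates the classification side.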
Source Title: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
URI: http://scholarbank.nus.edu.sg/handle/10635/69978
ISBN: 1-4244-1484-9
ISSN: 1520-6149
DOI: 10.1109/ICASSP.2008.4518364
Appears in Collections: Staff Publications
