Title: A new texture descriptor using multifractal analysis in multi-orientation wavelet pyramid
Source: Xu, Y., Yang, X., Ling, H., & Ji, H. (2010). A new texture descriptor using multifractal analysis in multi-orientation wavelet pyramid. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 161-168. ScholarBank@NUS Repository. https://doi.org/10.1109/CVPR.2010.5540217
Abstract: Based on multifractal analysis in wavelet pyramids of texture images, a new texture descriptor is proposed in this paper that implicitly combines information from both the spatial and frequency domains. Going beyond the traditional wavelet transform, our approach uses a multi-orientation wavelet leader pyramid that robustly encodes the multi-scale information of texture edgels. Moreover, the resulting texture model empirically shows a strong power-law relationship for natural textures, which can be characterized well by multifractal analysis. Combined with statistics of affine-invariant local patches, the proposed texture descriptor is robust to scale and rotation changes, more general geometric transforms, and illumination variations. In addition, the proposed texture descriptor is computationally efficient, since it does not require many expensive processing steps, e.g., texton generation and cross-bin comparisons, which are often used by existing methods. As an application, the proposed descriptor is applied to texture classification, and the experimental results on several public texture datasets verify the accuracy and efficiency of our descriptor. ©2010 IEEE.
Source Title: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
Appears in Collections: Staff Publications
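The abstract rests on a power-law relationship characterized by multifractal analysis. As a minimal, hedged sketch of the underlying measurement — not the authors' implementation, and omitting the wavelet leader pyramid entirely — the snippet below estimates a box-counting fractal dimension of a pixel set by fitting the slope of log N(box) versus log(1/box) over dyadic box sizes; all names and parameters are illustrative:

```python
# Hedged sketch: box-counting dimension, the basic power-law fit behind
# fractal/multifractal texture measurements. Illustrative only; the paper's
# descriptor uses a multi-orientation wavelet leader pyramid instead.
import math

def box_count(points, box):
    """Count boxes of side `box` containing at least one point."""
    occupied = {(x // box, y // box) for (x, y) in points}
    return len(occupied)

def box_counting_dimension(points, grid_size):
    """Least-squares slope of log N(box) vs. log(1/box) over dyadic scales."""
    xs, ys = [], []
    box = 1
    while box <= grid_size // 2:
        xs.append(math.log(1.0 / box))
        ys.append(math.log(box_count(points, box)))
        box *= 2
    # ordinary least-squares slope of ys against xs
    k = len(xs)
    mx, my = sum(xs) / k, sum(ys) / k
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Sanity checks on synthetic pixel sets:
N = 256
square = [(x, y) for x in range(N) for y in range(N)]  # filled region, dim ~ 2
line = [(x, 0) for x in range(N)]                      # straight line, dim ~ 1
```

In a multifractal setting, pixels are first partitioned by a local regularity measure (here the paper uses wavelet leaders) and a dimension like this is estimated per category, yielding a vector descriptor rather than a single exponent.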
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.