Please use this identifier to cite or link to this item: https://doi.org/10.1109/ICIP.2013.6738628
Title: Learning boundaries with color and depth
Authors: Jia Z.; Gallagher A.; Chen T.
Keywords: Image edge detection; Image segmentation; Markov random fields
Issue Date: 2013
Citation: Jia Z., Gallagher A., Chen T. (2013). Learning boundaries with color and depth. 2013 IEEE International Conference on Image Processing, ICIP 2013 - Proceedings : 3049-3053. ScholarBank@NUS Repository. https://doi.org/10.1109/ICIP.2013.6738628
Abstract: High-level understanding of a scene requires understanding the occlusion and connected boundaries of the objects in an image. In this paper, we propose a new framework for inferring boundaries from color and depth information. Even with depth information, finding and classifying boundaries is not a trivial task: real-world depth images are noisy, especially at object boundaries, which are precisely where our task is focused. Our approach uses features from both the color image (whose edges are sharp at object boundaries) and the depth image (which provides geometric cues) to detect boundaries and classify them as occlusion or connected. We propose depth features based on surface fitting to sparse point clouds, and perform inference with a Conditional Random Field. One advantage of our approach is that occlusion and connected boundaries are identified with a single, common model. Experiments show that our mid-level color and depth features outperform features computed from either depth or color alone, and that our method surpasses baseline boundary detection methods.
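
A minimal sketch of the kind of surface-fitting depth cue the abstract describes, not the authors' implementation: fit a local plane to the sparse 3-D points on either side of a candidate boundary and compare the fits. A large depth offset or normal disagreement between the two planes suggests an occlusion boundary, while well-aligned planes suggest a connected boundary. All function and variable names here are illustrative assumptions.

import numpy as np

def fit_plane(points):
    """Least-squares plane fit. Returns (unit normal, centroid, RMS residual)."""
    centroid = points.mean(axis=0)
    # SVD of the centered points; the right singular vector with the smallest
    # singular value is the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    residual = np.sqrt(np.mean(((points - centroid) @ normal) ** 2))
    return normal, centroid, residual

def boundary_depth_features(points_left, points_right):
    """Depth cues for a candidate boundary separating two local point sets."""
    n_l, c_l, r_l = fit_plane(points_left)
    n_r, c_r, r_r = fit_plane(points_right)
    return {
        # Angle between the two surface normals (degrees).
        "normal_angle": float(np.degrees(np.arccos(np.clip(abs(n_l @ n_r), -1.0, 1.0)))),
        # Offset of one patch's centroid from the other patch's plane: a depth step.
        "depth_gap": float(abs((c_l - c_r) @ n_r)),
        # Noisy depth around boundaries shows up as large fitting residuals.
        "fit_residual": float(max(r_l, r_r)),
    }

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two synthetic patches: a near plane and a far plane, i.e. an occlusion-like step.
    near = np.column_stack([rng.uniform(0, 1, 200), rng.uniform(0, 1, 200),
                            1.0 + 0.01 * rng.standard_normal(200)])
    far = np.column_stack([rng.uniform(0, 1, 200), rng.uniform(0, 1, 200),
                           1.5 + 0.01 * rng.standard_normal(200)])
    print(boundary_depth_features(near, far))

In the paper's framework such per-boundary cues would be combined with color edge features as unary/pairwise terms in the Conditional Random Field; the specific feature set and CRF structure used by the authors are given in the paper itself.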
Source Title: 2013 IEEE International Conference on Image Processing, ICIP 2013 - Proceedings
URI: http://scholarbank.nus.edu.sg/handle/10635/146096
ISBN: 9781479923410
DOI: 10.1109/ICIP.2013.6738628
Appears in Collections: Staff Publications
