Please use this identifier to cite or link to this item: https://doi.org/10.1109/IROS.2008.4650955
Title: Robust extraction of shady roads for vision-based UGV navigation
Authors: Dong-Si, T.-C.
Guo, D.
Yan, C.H. 
Ong, S.H. 
Issue Date: 2008
Citation: Dong-Si, T.-C., Guo, D., Yan, C.H., Ong, S.H. (2008). Robust extraction of shady roads for vision-based UGV navigation. 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS : 3140-3145. ScholarBank@NUS Repository. https://doi.org/10.1109/IROS.2008.4650955
Abstract: This paper addresses the problem of extracting the road region in different driving environments with dynamic lighting changes. Previous approaches using Gaussian mixture models (GMM) use a fixed number of models constructed from sample color data and cannot maintain models associated with shadows. As a result, although they work in some specific environments, they fail in other environments or in scenes with shadows. In this paper, we propose a new vision-based approach in which a flexible number of models is built from sample data. These color samples are reliably collected from stereo-verified ground patches inside a pre-defined trapezoidal learning region. After model construction, models associated with shadows and highlights are detected and maintained. The advantage of this approach over other techniques is that it gives more robust results and, in particular, recognizes shadows on the road as drivable road surface rather than non-road. ©2008 IEEE.
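The abstract's core idea (sample road colors from a bottom-of-frame trapezoidal learning region, fit a data-dependent number of color models so that shadow and highlight appearances get their own models, then label pixels near any model as road) can be sketched as follows. This is an illustrative reconstruction of the general technique, not the authors' implementation: the intensity-band model splitting, Mahalanobis threshold, and all function names and parameters are assumptions, and the stereo verification step is omitted.

```python
import numpy as np

def trapezoid_mask(h, w, top_frac=0.7, top_half=0.10, bottom_half=0.20):
    """Boolean mask of a trapezoidal learning region anchored at the
    bottom of an h x w frame (widths are fractions of image width)."""
    mask = np.zeros((h, w), dtype=bool)
    top = int(h * top_frac)
    for y in range(top, h):
        t = (y - top) / max(h - 1 - top, 1)        # 0 at top edge, 1 at bottom
        half = (top_half + t * (bottom_half - top_half)) * w
        c = w / 2
        mask[y, int(c - half):int(c + half)] = True
    return mask

def fit_models(samples, max_models=4, min_pixels=20):
    """Split the color samples into intensity bands and fit one Gaussian
    per band with enough support, so shadowed and sunlit road each get a
    model; the returned list has a data-dependent length."""
    intensity = samples.mean(axis=1)
    edges = np.linspace(intensity.min(), intensity.max() + 1e-6, max_models + 1)
    models = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        band = samples[(intensity >= lo) & (intensity < hi)]
        if len(band) >= min_pixels:                # drop unsupported models
            mu = band.mean(axis=0)
            cov = np.cov(band.T) + 1e-3 * np.eye(3)  # regularize covariance
            models.append((mu, np.linalg.inv(cov)))
    return models

def classify(img, models, thresh=9.0):
    """Label a pixel as road if its squared Mahalanobis distance to ANY
    color model (including shadow models) is below thresh."""
    h, w, _ = img.shape
    px = img.reshape(-1, 3).astype(float)
    road = np.zeros(len(px), dtype=bool)
    for mu, icov in models:
        d = px - mu
        road |= np.einsum('ij,jk,ik->i', d, icov, d) < thresh
    return road.reshape(h, w)

# Synthetic frame: gray road with a darker shadow band, green roadsides.
h, w = 120, 160
img = np.zeros((h, w, 3))
img[:, :] = [60, 160, 60]                          # vegetation
img[:, 40:120] = [120, 120, 120]                   # sunlit road
img[80:100, 40:120] = [55, 55, 60]                 # shadow on the road

samples = img[trapezoid_mask(h, w)]                # learning-region colors
models = fit_models(samples)                       # >= 2 models: road + shadow
road = classify(img, models)                       # shadow is labeled road
```

Because the learning region spans both the shadowed and sunlit parts of the road, two separate Gaussians survive the support check, and shadow pixels are classified as drivable rather than rejected as non-road, which is the behavior the abstract claims over fixed-model GMM approaches.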
Source Title: 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS
URI: http://scholarbank.nus.edu.sg/handle/10635/71674
ISBN: 9781424420582
DOI: 10.1109/IROS.2008.4650955
Appears in Collections:Staff Publications

Files in This Item:
There are no files associated with this item.

SCOPUS™ Citations: 8 (checked on Nov 14, 2018)
Web of Science™ Citations: 3 (checked on Nov 6, 2018)
Page view(s): 29 (checked on Oct 20, 2018)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.