Please use this identifier to cite or link to this item: https://doi.org/10.1109/ISMAR.2010.5643567
DC Field / Value
dc.title: Positioning, tracking and mapping for outdoor augmentation
dc.contributor.author: Karlekar, J.
dc.contributor.author: Zhou, S.Z.
dc.contributor.author: Lu, W.
dc.contributor.author: Loh, Z.C.
dc.contributor.author: Nakayama, Y.
dc.contributor.author: Hii, D.
dc.date.accessioned: 2014-06-19T03:24:11Z
dc.date.available: 2014-06-19T03:24:11Z
dc.date.issued: 2010
dc.identifier.citation: Karlekar, J., Zhou, S.Z., Lu, W., Loh, Z.C., Nakayama, Y., Hii, D. (2010). Positioning, tracking and mapping for outdoor augmentation. 9th IEEE International Symposium on Mixed and Augmented Reality 2010: Science and Technology, ISMAR 2010 - Proceedings: 175-184. ScholarBank@NUS Repository. https://doi.org/10.1109/ISMAR.2010.5643567
dc.identifier.isbn: 9781424493449
dc.identifier.uri: http://scholarbank.nus.edu.sg/handle/10635/71477
dc.description.abstract: This paper presents a novel approach for user positioning, robust tracking and online 3D mapping for outdoor augmented reality applications. Because the coarse user pose obtained from GPS and orientation sensors is not accurate enough for augmented reality applications, a sub-meter-accurate user pose is estimated with a one-step silhouette matching approach. Silhouette matching between the rendered 3D model and the camera data is carried out with shape context descriptors, as they are invariant to translation, scale and rotation errors, giving rise to a non-iterative registration approach. Once the user is correctly positioned, further tracking is carried out with camera data alone. Drift associated with vision-based approaches is minimized by combining different feature modalities. Robust visual tracking is maintained by fusing frame-to-frame and model-to-frame feature matches. Frame-to-frame tracking is accomplished with corner matching, while edges are used for model-to-frame registration. Results from the individual feature trackers are fused using a pose estimate obtained from an extended Kalman filter (EKF) and a weighted M-estimator. In scenarios where dense 3D models of the environment are not available, online incremental 3D mapping and tracking is proposed to track the user in unprepared environments. Incremental mapping prepares the 3D point cloud of the outdoor environment for tracking. ©2010 IEEE.
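The abstract describes fusing pose estimates from two trackers (frame-to-frame corner matching and model-to-frame edge registration) via an EKF and a weighted M-estimator. As a rough illustration of the underlying idea, the sketch below fuses two 6-DoF pose vectors by per-component inverse-variance weighting; this is a minimal stand-in, not the paper's actual EKF/M-estimator implementation, and all names and variance values are illustrative assumptions.

```python
def fuse_poses(pose_a, var_a, pose_b, var_b):
    """Inverse-variance fusion of two 6-DoF pose vectors.

    pose_a, pose_b: lists of 6 floats (3 translation + 3 rotation params).
    var_a, var_b:   scalar variances expressing each tracker's confidence
                    (illustrative simplification; the paper uses an EKF
                    and a weighted M-estimator, not this scheme).
    Returns (fused_pose, fused_var): the weighted pose and its variance.
    """
    w_a, w_b = 1.0 / var_a, 1.0 / var_b   # precision weights
    v = 1.0 / (w_a + w_b)                  # fused per-component variance
    fused_pose = [(pa * w_a + pb * w_b) * v for pa, pb in zip(pose_a, pose_b)]
    fused_var = [v] * len(fused_pose)
    return fused_pose, fused_var


# Hypothetical example: a confident corner tracker and a noisier edge tracker.
corner_pose = [1.0, 2.0, 0.5, 0.10, 0.00, 0.20]   # frame-to-frame estimate
edge_pose   = [1.2, 2.2, 0.7, 0.10, 0.10, 0.20]   # model-to-frame estimate
fused, fused_var = fuse_poses(corner_pose, 0.04, edge_pose, 0.16)
```

The fused result lies closer to the more confident tracker, and its variance is smaller than either input's, which is the basic property any such fusion step should preserve.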
dc.description.uri: http://libproxy1.nus.edu.sg/login?url=http://dx.doi.org/10.1109/ISMAR.2010.5643567
dc.source: Scopus
dc.subject: 3D mapping
dc.subject: Augmented reality
dc.subject: H.5.1 [Information Systems]: Multimedia Information Systems - augmented reality; I.4.8 [Image Processing and Computer Vision]: Scene Analysis - sensor fusion, tracking
dc.subject: Robust tracking
dc.subject: Sensor fusion
dc.subject: Shape matching
dc.subject: User positioning
dc.type: Conference Paper
dc.contributor.department: ELECTRICAL & COMPUTER ENGINEERING
dc.contributor.department: DEAN'S OFFICE (SCHOOL OF DESIGN & ENV)
dc.description.doi: 10.1109/ISMAR.2010.5643567
dc.description.sourcetitle: 9th IEEE International Symposium on Mixed and Augmented Reality 2010: Science and Technology, ISMAR 2010 - Proceedings
dc.description.page: 175-184
dc.identifier.isiut: NOT_IN_WOS
Appears in Collections:Staff Publications

Files in This Item: There are no files associated with this item.

Scopus™ Citations: 21 (checked on Jan 20, 2020)

Page view(s): 113 (checked on Dec 29, 2019)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.