Please use this identifier to cite or link to this item: https://scholarbank.nus.edu.sg/handle/10635/73583
Title: Marker-less computer vision tracking for augmented reality
Authors: Fong, W.T.
Ong, S.K. 
Nee, A.Y.C. 
Keywords: Augmented reality
Computer vision tracking
Illumination model
Non-linear optimization
Issue Date: 2010
Citation: Fong, W.T., Ong, S.K., Nee, A.Y.C. (2010). Marker-less computer vision tracking for augmented reality. Proc. of the IADIS Int. Conf. - Computer Graphics, Visualization, Computer Vision and Image Processing, CGVCVIP 2010, Visual Commun., VC 2010, Web3DW 2010, Part of the MCCSIS 2010 : 85-92. ScholarBank@NUS Repository.
Abstract: A real-time marker-less computer vision tracker designed for Augmented Reality is presented. It obtains accurate camera positions and orientations relative to well-patterned planar surfaces in real time, without the use of artificial markers. It relies on Efficient Second-order Minimization (ESM) of the pixel intensity errors over a large image region to perform frame-to-frame tracking. A sub-grid gradient-based reference image selection process and an illumination model are proposed to improve the robustness and efficiency of ESM tracking. The proposed illumination model allows ESM tracking to robustly handle shadows, glares, general ambient light changes and partial occlusion, while increasing the accuracy and processing speed. An evaluation using video sequences of real-object tracking is presented. A feature-tracking component is added to automatically initialize and recover ESM tracking. © 2010 IADIS.
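
Note: the record carries no implementation. As a rough illustration of the ESM idea the abstract describes, the Python sketch below aligns two image patches by minimizing pixel intensity errors, using the ESM update (averaging reference and warped-current image gradients) for a pure-translation warp together with a simple global gain/bias illumination model. The translation-only warp, the gain/bias model, and all names here are illustrative assumptions; the paper's actual tracker uses a richer planar warp and illumination model, plus sub-grid reference selection and feature-based initialization, none of which are reproduced here.

import numpy as np
from scipy.ndimage import map_coordinates


def esm_translation(ref, cur, p=np.zeros(2), iters=20):
    """Estimate a translation p = (px, py) such that cur(y+py, x+px)
    matches a*ref(y, x) + b, by ESM-style photometric minimization."""
    ref = ref.astype(np.float64)
    cur = cur.astype(np.float64)
    gy_r, gx_r = np.gradient(ref)                      # reference gradients
    ys, xs = np.mgrid[0:ref.shape[0], 0:ref.shape[1]]  # pixel grid
    for _ in range(iters):
        # Warp the current image back onto the reference grid.
        coords = np.stack([ys + p[1], xs + p[0]])
        warped = map_coordinates(cur, coords, order=1, mode='nearest')
        # Global gain/bias illumination compensation (an assumption, not
        # the paper's model): fit warped ~ a*ref + b in closed form.
        a, b = np.polyfit(ref.ravel(), warped.ravel(), 1)
        err = (warped - (a * ref + b)).ravel()
        # ESM Jacobian: average of reference and warped-current gradients,
        # which gives near-second-order convergence without a Hessian.
        gy_c, gx_c = np.gradient(warped)
        J = 0.5 * np.stack([(gx_r + gx_c).ravel(),
                            (gy_r + gy_c).ravel()], axis=1)
        dp, *_ = np.linalg.lstsq(J, -err, rcond=None)  # Gauss-Newton step
        p = p + dp
        if np.linalg.norm(dp) < 1e-3:
            break
    return p


# Usage: recover a small known shift under a gain/bias change.
ys0, xs0 = np.mgrid[0:120, 0:120]
img = np.sin(xs0 / 7.0) + np.cos(ys0 / 9.0)      # smooth textured pattern
ref = img[20:80, 20:80]
cur = 0.9 * img[18:78, 21:81] + 0.05             # shifted, gain 0.9, bias 0.05
print(esm_translation(ref, cur))                 # converges to approx. [-1, 2]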
Source Title: Proc. of the IADIS Int. Conf. - Computer Graphics, Visualization, Computer Vision and Image Processing, CGVCVIP 2010, Visual Commun., VC 2010, Web3DW 2010, Part of the MCCSIS 2010
URI: http://scholarbank.nus.edu.sg/handle/10635/73583
ISBN: 9789728939229
Appears in Collections: Staff Publications

Files in This Item:
There are no files associated with this item.

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.