Title: Real-time camera tracking for marker-less and unprepared augmented reality environments
Authors: Xu, K. 
Chia, K.W.
Cheok, A.D. 
Keywords: Augmented reality
Fundamental matrix
Optical flow
Vision based tracking
Issue Date: 1-May-2008
Source: Xu, K., Chia, K.W., Cheok, A.D. (2008-05-01). Real-time camera tracking for marker-less and unprepared augmented reality environments. Image and Vision Computing 26 (5) : 673-689. ScholarBank@NUS Repository.
Abstract: For three-dimensional video-based augmented reality applications, accurate measurements of the 6DOF camera pose relative to the real world are required for proper registration of the virtual objects. This paper presents an accurate and robust system for real-time 6DOF camera pose tracking based on natural features in an arbitrary scene. Crucially, the calculation is based on pre-captured reference images, which prevents a gradual increase in the camera position error. Point features in the current image frame are first matched to two spatially separated reference images. This wide-baseline correspondence problem is overcome by constructing (1) a global homography between the current and previous image frames and (2) local affine transforms derived from known matches between the previous frame and the reference images. Chaining these two mappings constrains the search for potential matches in the reference images and allows the warping of corner intensity neighborhoods, so that a viewpoint-invariant similarity measure for assessing potential point matches can be defined. We then minimize deviations from the two-view and three-view constraints between the reference images and the current frame as a function of the camera motion parameters to obtain an estimate of the current camera pose relative to the reference images. This calculation is stabilized using a recursive form of temporal regularization similar in spirit to the Kalman filter. We can track camera pose reliably over hundreds of image frames and realistically integrate three-dimensional virtual objects with only slight jitter. This paper also simplifies the algorithm described above into a real-time, robust tracking system based on computing homographies. A homography exactly describes the image motion between two frames when the camera motion is a pure rotation or the camera is viewing a planar scene.
For outdoor registration applications, the system is robust under small translations as long as the majority of the scene contents are distant. © 2008.
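To illustrate the homography computation at the core of the simplified tracker, here is a minimal sketch, not the authors' implementation: a plain direct linear transform (DLT) in NumPy that recovers the 3×3 homography from point correspondences between two frames. The function names and the unnormalized DLT formulation are assumptions for illustration; a production tracker would add coordinate normalization and robust (e.g. RANSAC-based) outlier rejection.

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate the 3x3 homography H mapping src -> dst (N >= 4 point pairs)
    via the direct linear transform (DLT). Illustrative sketch only."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear constraints on the
        # nine entries of H (stacked row-major as a 9-vector h).
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    A = np.asarray(rows)
    # h is the null vector of A: the right singular vector with the
    # smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # fix the projective scale so H[2, 2] = 1

def apply_homography(H, pts):
    """Map 2-D points through H using homogeneous coordinates."""
    pts = np.asarray(pts, dtype=float)
    ph = np.hstack([pts, np.ones((len(pts), 1))])
    q = ph @ H.T
    return q[:, :2] / q[:, 2:3]
```

With exact correspondences, the recovered H matches the generating homography up to numerical precision; with noisy feature matches, the least-squares nature of the SVD solution averages the error over all points.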
Source Title: Image and Vision Computing
ISSN: 0262-8856
DOI: 10.1016/j.imavis.2007.08.015
Appears in Collections: Staff Publications

Files in This Item:
There are no files associated with this item.

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.