|Title:||Visual registration for unprepared augmented reality environments|
|Authors:||Xu, K.; Prince, S.J.D.; Cheok, A.D.; Qiu, Y.; Kumar, K.G.|
|Keywords:||Vision based tracking|
|Citation:||Xu, K., Prince, S.J.D., Cheok, A.D., Qiu, Y., Kumar, K.G. (2003). Visual registration for unprepared augmented reality environments. Personal and Ubiquitous Computing 7 (5) : 287-298. ScholarBank@NUS Repository. https://doi.org/10.1007/s00779-003-0241-z|
|Abstract:||Despite the increasing sophistication of augmented reality (AR) tracking technology, tracking in unprepared environments remains an enormous challenge, according to a recent survey. Most current systems are based on a calculation of the optical flow between the current and previous frames to adjust the label position. Here we present two alternative algorithms based on geometrical image constraints. The first is based on epipolar geometry and provides a general description of the constraints on image flow between two static scenes. The second is based on the calculation of a homography relationship between the current frame and a stored representation of the scene. A homography can exactly describe the image motion when the scene is planar, or when the camera movement is a pure rotation, and provides a good approximation when these conditions are nearly met. We assess all three styles of algorithms with a number of criteria including robustness, speed and accuracy. We demonstrate two real-time AR systems here, which are based on the estimation of homography. One is an outdoor geographical labelling/overlaying system, and the other is an AR Pacman game application. © 2003 Springer-Verlag London Limited.|
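The homography relationship the abstract relies on can be illustrated with a minimal direct linear transform (DLT) estimator. This is a textbook sketch, not the authors' implementation: the function names and the example point set below are illustrative, and a real AR pipeline would add feature matching and robust (e.g. RANSAC) outlier rejection.

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate the 3x3 homography H mapping src -> dst via the DLT algorithm.
    src, dst: (N, 2) arrays of corresponding image points, N >= 4."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear constraints on the
        # nine entries of H (valid up to scale).
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # H is the null vector of A: the right singular vector with the
    # smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_homography(H, pts):
    """Map (N, 2) points through H, including the perspective division."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]

# Hypothetical example: four points on a plane and their images under a
# known homography, from which the estimator recovers that homography.
src = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
H_true = np.array([[1.1, 0.05, 0.2],
                   [0.02, 0.95, -0.1],
                   [0.001, 0.002, 1.0]])
dst = apply_homography(H_true, src)
H_est = estimate_homography(src, dst)
print(np.allclose(H_est, H_true, atol=1e-6))  # True
```

In an AR registration setting such as the one described above, H would be estimated between the current camera frame and a stored view of the scene, and labels placed in the stored view would then be warped into the live frame with `apply_homography`.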
|Source Title:||Personal and Ubiquitous Computing|
|Appears in Collections:||Staff Publications|