Title: Markerless augmented reality using a robust point transferring method
Authors: Ong, S.K.; Yuan, M.L.; Nee, A.Y.C.
Issue Date: 2007
Citation: Ong, S.K., Yuan, M.L., Nee, A.Y.C. (2007). Markerless augmented reality using a robust point transferring method. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) 4352 LNCS (PART 2): 258-268. ScholarBank@NUS Repository. https://doi.org/10.1007/978-3-540-69429-8_26
Abstract: This paper proposes a robust point transferring method for markerless AR applications. Using this method, any points specified at the initialization stage can be stably transferred during the augmentation process. These transferred points can be used for registration, annotation and video augmentation in markerless AR applications. The proposed point transferring method is based on a simple nonlinear optimization model and has several advantages. First, it is robust and stable: it remains effective when the camera moves quickly or when the scenes are largely occluded or filled with moving objects. Second, it is simple: the points used for registration, annotation and video augmentation need only be specified in one image. Finally, it is fast: the simple optimization model can be solved quickly. Several experiments have been conducted to validate the performance of the proposed method. © Springer-Verlag Berlin Heidelberg 2007.
Source Title: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
URI: http://scholarbank.nus.edu.sg/handle/10635/73582
ISBN: 9783540694281
ISSN: 0302-9743
DOI: 10.1007/978-3-540-69429-8_26
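The abstract does not give the paper's actual optimization model, so the following is only an illustrative sketch of the general idea of "point transfer": a point specified in one (initialization) image is mapped into another view. The sketch assumes, for simplicity, that the observed scene region is planar so the two views are related by a 3x3 homography H; the function name `transfer_point` and the example matrices are invented for this illustration and are not from the paper.

```python
# Illustrative sketch only -- NOT the paper's nonlinear optimization method.
# Assumes a planar scene region, so a 3x3 homography H (nested lists) maps
# a point from the initialization image into the current frame.

def transfer_point(H, point):
    """Map image point (x, y) through homography H in homogeneous coordinates."""
    x, y = point
    u = H[0][0] * x + H[0][1] * y + H[0][2]
    v = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    # Divide by the homogeneous coordinate to return to pixel coordinates.
    return (u / w, v / w)

# The identity homography leaves a point unchanged.
H_identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
# A homography encoding a pure translation of (10, 5) pixels.
H_shift = [[1, 0, 10], [0, 1, 5], [0, 0, 1]]

print(transfer_point(H_identity, (100.0, 50.0)))  # (100.0, 50.0)
print(transfer_point(H_shift, (100.0, 50.0)))     # (110.0, 55.0)
```

In a real markerless AR pipeline, H would be estimated per frame from tracked feature correspondences; the paper's contribution is a more robust estimation scheme that tolerates fast camera motion and large occlusions.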
Appears in Collections: Staff Publications
Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.