Please use this identifier to cite or link to this item: http://scholarbank.nus.edu.sg/handle/10635/71398
Title: Performance analysis of visual tracking algorithms for motion-based user interfaces on mobile devices
Authors: Winkler, S.; Rangaswamy, K.; Tedjokusumo, J.; Zhou, Z.
Issue Date: 2008
Citation: Winkler, S., Rangaswamy, K., Tedjokusumo, J., Zhou, Z. (2008). Performance analysis of visual tracking algorithms for motion-based user interfaces on mobile devices. Proceedings of SPIE - The International Society for Optical Engineering, 6821. ScholarBank@NUS Repository. https://doi.org/10.1117/12.766242
Abstract: Determining the self-motion of a camera is useful for many applications. A number of visual motion-tracking algorithms have been developed to date, each with its own advantages and restrictions. Some have also made their way into the mobile world, powering augmented-reality applications on camera phones. In this paper, we compare the performance of three feature- or landmark-guided motion-tracking algorithms: marker-based tracking with MXRToolkit, face tracking based on CamShift, and MonoSLAM. We analyze and compare the complexity, accuracy, sensitivity, robustness, and restrictions of each method. Our performance tests are conducted in two stages: the first uses video sequences created with simulated camera movements along the six degrees of freedom to compare tracking accuracy, while the second analyzes the robustness of the algorithms under manipulations such as image scaling and frame skipping. © 2008 SPIE-IS&T.
Source Title: Proceedings of SPIE - The International Society for Optical Engineering
URI: http://scholarbank.nus.edu.sg/handle/10635/71398
ISBN: 9780819469939
ISSN: 0277-786X
DOI: 10.1117/12.766242
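The CamShift tracker mentioned in the abstract is built on mean-shift iterations: a search window is repeatedly re-centered on the centroid of a probability (back-projection) image until it settles on the tracked object, after which CamShift also adapts the window size. A minimal NumPy sketch of that core mean-shift loop, run on a synthetic probability map rather than a real camera frame (the blob position, window size, and convergence threshold below are illustrative assumptions, not values from the paper):

```python
import numpy as np

def mean_shift(prob, window, max_iter=20, eps=1.0):
    """Re-center `window` = (x, y, w, h) on the local centroid of `prob`
    until the shift per iteration drops below `eps` pixels."""
    x, y, w, h = window
    for _ in range(max_iter):
        roi = prob[y:y + h, x:x + w]
        total = roi.sum()
        if total == 0:
            break  # no probability mass under the window
        ys, xs = np.mgrid[0:h, 0:w]
        cx = (xs * roi).sum() / total  # centroid in window coordinates
        cy = (ys * roi).sum() / total
        dx = cx - (w - 1) / 2.0        # shift needed to center the window
        dy = cy - (h - 1) / 2.0
        if abs(dx) < eps and abs(dy) < eps:
            break
        x = int(round(x + dx))
        y = int(round(y + dy))
    return (x, y, w, h)

# Synthetic "back projection": a Gaussian blob centered at (70, 40)
h_img, w_img = 100, 120
ys, xs = np.mgrid[0:h_img, 0:w_img]
prob = np.exp(-((xs - 70) ** 2 + (ys - 40) ** 2) / (2 * 8.0 ** 2))

# Start the window away from the blob; mean shift drifts onto it
x, y, w, h = mean_shift(prob, (40, 15, 30, 30))
print(x + w // 2, y + h // 2)  # window center ends near (70, 40)
```

In the full CamShift algorithm the probability map comes from a hue histogram of the target (e.g. a face) back-projected onto each frame, and the window's scale and orientation are re-estimated from the image moments after each convergence.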
Appears in Collections: Staff Publications
Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.