Please use this identifier to cite or link to this item:
Title: Improving PMVS algorithm for 3D scene reconstruction from sparse stereo pairs
Authors: Li, B.; Venkatesh, Y.V.; Kassim, A.; Lu, Y.
Issue Date: 2013
Citation: Li, B., Venkatesh, Y.V., Kassim, A., Lu, Y. (2013). Improving PMVS algorithm for 3D scene reconstruction from sparse stereo pairs. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) 8294 LNCS: 221-232. ScholarBank@NUS Repository. https://doi.org/10.1007/978-3-319-03731-8_21
Abstract: 3D scene reconstruction from a limited number of stereo pairs captured by a 3D camera is a nontrivial and challenging task, even for current state-of-the-art multi-view stereo (MVS) reconstruction algorithms. It also has many potential applications in related areas, such as robotics, virtual reality, video games, and 3D animation. In this paper, we analyze the performance of PMVS (Patch-based Multi-View Stereo software) for scene reconstruction from stereo pairs of scenes captured by a simple 3D camera. We demonstrate that, when applied to a limited number of stereo pairs, PMVS is inadequate for 3D scene reconstruction, and we discuss new strategies to overcome these limitations and improve 3D reconstruction. The proposed Canny edge feature-based PMVS algorithm is shown to produce better reconstruction results. We also discuss further enhancements using dense feature matching and disparity map-based stereo reconstruction. © Springer International Publishing Switzerland 2013.
Source Title: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
URI: http://scholarbank.nus.edu.sg/handle/10635/83836
ISBN: 9783319037301
ISSN: 16113349
DOI: 10.1007/978-3-319-03731-8_21
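The abstract's core idea is to seed PMVS with Canny edge features rather than relying only on its default corner/blob detectors. The paper itself is not reproduced here, so the following is only a minimal illustrative sketch of gradient-based edge detection (Sobel magnitude plus a threshold, omitting Canny's Gaussian smoothing, non-maximum suppression, and hysteresis); the function name, image, and threshold are invented for illustration and are not from the paper.

```python
import numpy as np

def sobel_edges(img, threshold=1.0):
    """Simplified edge map: Sobel gradient magnitude thresholding.

    Illustrative stand-in for the Canny detector named in the abstract;
    full Canny additionally applies Gaussian smoothing, non-maximum
    suppression, and hysteresis thresholding.
    """
    kx = np.array([[-1.0, 0.0, 1.0],
                   [-2.0, 0.0, 2.0],
                   [-1.0, 0.0, 1.0]])   # horizontal-gradient kernel
    ky = kx.T                           # vertical-gradient kernel
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(1, h - 1):           # skip the 1-pixel border
        for j in range(1, w - 1):
            patch = img[i - 1:i + 2, j - 1:j + 2]
            gx[i, j] = np.sum(patch * kx)
            gy[i, j] = np.sum(patch * ky)
    mag = np.hypot(gx, gy)              # gradient magnitude
    return mag > threshold              # boolean edge map

# Synthetic test image: dark left half, bright right half -> vertical edge.
img = np.zeros((8, 8))
img[:, 4:] = 1.0
edges = sobel_edges(img, threshold=1.0)
```

In an edge-seeded pipeline, pixels where `edges` is true would be matched across the stereo pair and used as initial patches for PMVS expansion, which is the kind of strategy the abstract describes for sparse stereo input.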
Appears in Collections: Staff Publications
Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.