Please use this identifier to cite or link to this item:
https://scholarbank.nus.edu.sg/handle/10635/40791
DC Field | Value
---|---
dc.title | An adaptive sampling method for layered depth image
dc.contributor.author | Namboori, R.
dc.contributor.author | Teh, H.C.
dc.contributor.author | Huang, Z.
dc.date.accessioned | 2013-07-04T08:12:25Z
dc.date.available | 2013-07-04T08:12:25Z
dc.date.issued | 2004
dc.identifier.citation | Namboori, R., Teh, H.C., Huang, Z. (2004). An adaptive sampling method for layered depth image. Proceedings of Computer Graphics International Conference, CGI: 206-213. ScholarBank@NUS Repository.
dc.identifier.issn | 1530-1052
dc.identifier.uri | http://scholarbank.nus.edu.sg/handle/10635/40791
dc.description.abstract | Sampling is an important problem in image-based rendering. In this paper, we propose an adaptive sampling method to improve the Layered Depth Image framework. Unlike existing methods that interpolate or splat neighboring pixels, our method selects a set of sampling views based on a scene analysis, which guarantees the final rendering quality. Furthermore, rendering is accelerated by a pre-computed patch lookup table, which reduces reference view selection to a single hash table lookup (an illustrative sketch follows this record). We have implemented our method, and an experimental study demonstrates its advantages.
dc.source | Scopus
dc.subject | Data sampling
dc.subject | Image based rendering
dc.subject | Image warping
dc.subject | Layered depth images
dc.type | Conference Paper
dc.contributor.department | COMPUTER SCIENCE
dc.description.sourcetitle | Proceedings of Computer Graphics International Conference, CGI
dc.description.page | 206-213
dc.identifier.isiut | NOT_IN_WOS
Appears in Collections: Staff Publications
Files in This Item:
There are no files associated with this item.
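The abstract above describes the paper's main speed optimization: a pre-computed patch lookup table that reduces reference view selection to a hash table lookup. Since no full text is attached to this record, the following is only a minimal, hypothetical Python sketch of that general idea; the patch key, the quality score, and all names (`Patch`, `sampling_quality`, `build_patch_table`) are assumptions for illustration, not the authors' actual data structures or algorithm.

```python
# Hypothetical sketch of a pre-computed patch lookup table for reference
# view selection, as described in the abstract. All names and the quality
# heuristic are illustrative assumptions, not the paper's actual method.

from dataclasses import dataclass
import math


@dataclass(frozen=True)
class Patch:
    """A quantized scene patch: spatial cell index plus a quantized normal bin."""
    cell: tuple          # e.g. (i, j, k) grid cell containing the patch
    normal_bin: int      # quantized orientation of the patch


def sampling_quality(view_pos, patch_center, patch_normal):
    """Toy quality score: favor views that face the patch head-on and are close."""
    dx = [v - p for v, p in zip(view_pos, patch_center)]
    dist = math.sqrt(sum(d * d for d in dx)) or 1e-9
    cos = sum(d * n for d, n in zip(dx, patch_normal)) / dist
    return max(cos, 0.0) / dist


def build_patch_table(patches, views):
    """Precompute: patch -> index of the sampled view with the best quality.

    `patches` maps each Patch to its (center, normal); `views` is a list of
    candidate view positions. Done once offline, so rendering never re-scores views.
    """
    table = {}
    for patch, (center, normal) in patches.items():
        best = max(range(len(views)),
                   key=lambda i: sampling_quality(views[i], center, normal))
        table[patch] = best
    return table


# Usage: at render time, picking the reference view for a patch is an O(1) lookup.
views = [(0.0, 0.0, 5.0), (4.0, 0.0, 3.0), (-4.0, 0.0, 3.0)]
patches = {
    Patch((0, 0, 0), 0): ((0.0, 0.0, 0.0), (0.0, 0.0, 1.0)),
    Patch((1, 0, 0), 2): ((1.0, 0.0, 0.0), (0.7, 0.0, 0.7)),
}
table = build_patch_table(patches, views)
print(table[Patch((0, 0, 0), 0)])  # index of the best reference view for that patch
```

At render time, the warping loop would consult `table[patch]` instead of re-evaluating every candidate view per pixel, which is the kind of constant-time selection the abstract refers to.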
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.