Please use this identifier to cite or link to this item:
https://doi.org/10.1109/TPAMI.2010.222
Title: Richardson-Lucy deblurring for scenes under a projective motion path
Authors: Tai, Y.-W.; Tan, P.; Brown, M.S.
Keywords: Motion deblurring; spatially varying motion blur
Issue Date: 2011
Citation: Tai, Y.-W., Tan, P., Brown, M.S. (2011). Richardson-Lucy deblurring for scenes under a projective motion path. IEEE Transactions on Pattern Analysis and Machine Intelligence 33 (8): 1603-1618. ScholarBank@NUS Repository. https://doi.org/10.1109/TPAMI.2010.222
Abstract: This paper addresses how to model and correct image blur that arises when a camera undergoes ego motion while observing a distant scene. In particular, we discuss how the blurred image can be modeled as an integration of the clear scene under a sequence of planar projective transformations (i.e., homographies) that describe the camera's path. This projective motion path blur model is more effective at modeling the spatially varying motion blur exhibited by ego motion than conventional methods based on space-invariant blur kernels. To correct the blurred image, we describe how to modify the Richardson-Lucy (RL) algorithm to incorporate this new blur model. In addition, we show that our projective motion RL algorithm can incorporate state-of-the-art regularization priors to improve the deblurred results. The projective motion path blur model, along with the modified RL algorithm, is detailed, together with experimental results demonstrating its overall effectiveness. Statistical analysis on the algorithm's convergence properties and robustness to noise is also provided. © 2011 IEEE.
Source Title: IEEE Transactions on Pattern Analysis and Machine Intelligence
URI: http://scholarbank.nus.edu.sg/handle/10635/43130
ISSN: 0162-8828
DOI: 10.1109/TPAMI.2010.222
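The abstract describes modeling the blurred image as the average of the sharp scene under a sequence of motion-path transforms, and replacing the convolutions in the Richardson-Lucy update with that spatially varying forward model and its adjoint. The following is a minimal sketch of that idea, not the paper's implementation: for simplicity each homography is reduced to an integer translation applied with `np.roll` (the paper uses full planar homographies), and the function names `forward_blur` and `projective_motion_rl` are illustrative, not from the paper.

```python
import numpy as np

def forward_blur(img, shifts):
    # Blur model: average of the sharp image under each motion-path
    # transform. Each transform is simplified here to an integer
    # translation (the paper uses general homographies H_i).
    return np.mean([np.roll(img, s, axis=(0, 1)) for s in shifts], axis=0)

def projective_motion_rl(blurred, shifts, iters=150, eps=1e-8):
    # Modified Richardson-Lucy iteration: the spatially varying blur and
    # its adjoint (here, the inverse translations) take the place of the
    # usual convolutions with the PSF and its mirrored copy.
    est = np.full_like(blurred, blurred.mean())  # flat nonnegative init
    inverse_shifts = [(-r, -c) for (r, c) in shifts]
    for _ in range(iters):
        ratio = blurred / (forward_blur(est, shifts) + eps)
        est = est * forward_blur(ratio, inverse_shifts)
    return est
```

In this noiseless translational toy case the multiplicative update keeps the estimate nonnegative and progressively reduces the reconstruction error, mirroring the convergence behavior the paper analyzes for the full projective model.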
Appears in Collections: | Staff Publications |
Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.