|Title:||Analogies based video editing|
|Authors:||Yan, W.-Q.; Kankanhalli, M.S.; Wang, J.|
|Keywords:||Automatic video editing|
|Source:||Yan, W.-Q., Kankanhalli, M.S., Wang, J. (2005). Analogies based video editing. Multimedia Systems 11(1): 3-18. ScholarBank@NUS Repository. https://doi.org/10.1007/s00530-005-0186-3|
|Abstract:||A well-produced video always creates a strong impression on the viewer. However, due to the limitations of the camera, the ambient conditions, or the skills of the videographer, the quality of a captured video sometimes falls short of one's expectations. On the other hand, a vast amount of superbly captured video is available on the web and in digital libraries. In this paper, we propose the novel approach of video analogies, which improves the quality of a target video by borrowing features from a higher-quality source video. During the matching phase, we find the correspondence between the pair by feature matching. We then use this correspondence to transfer desired traits of the source video into the target video, producing a new video that acquires the desired features of the source while retaining the merits of the target. The video analogies technique thus provides an intuitive mechanism for automatic video editing. We demonstrate its utility in three applications: colorizing videos, reducing video blur, and adjusting video rhythm. We describe each application in detail and provide experimental results to establish the efficacy of the proposed approach.|
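The pipeline the abstract describes (match corresponding frames, then transfer traits from source to target) can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes frames are H×W×3 `uint8` NumPy arrays, stands in "mean intensity" for the paper's richer feature matching, and uses simple per-channel mean/std color-statistics transfer as the trait being borrowed.

```python
import numpy as np


def transfer_color_stats(source, target):
    """Impose the source frame's per-channel color statistics on the target.

    A simple stand-in for trait transfer: shift/scale each target channel
    so its mean and standard deviation match the source channel's.
    """
    src = source.astype(np.float64)
    tgt = target.astype(np.float64)
    out = np.empty_like(tgt)
    for c in range(3):
        s_mean, s_std = src[..., c].mean(), src[..., c].std()
        t_mean, t_std = tgt[..., c].mean(), tgt[..., c].std()
        scale = s_std / t_std if t_std > 1e-8 else 1.0
        out[..., c] = (tgt[..., c] - t_mean) * scale + s_mean
    return np.clip(out, 0, 255).astype(np.uint8)


def edit_video_by_analogy(source_frames, target_frames):
    """For each target frame, find the best-matching source frame
    (here: nearest mean intensity, a hypothetical toy feature),
    then transfer that source frame's color traits onto it."""
    src_means = np.array([f.mean() for f in source_frames])
    edited = []
    for frame in target_frames:
        idx = int(np.argmin(np.abs(src_means - frame.mean())))
        edited.append(transfer_color_stats(source_frames[idx], frame))
    return edited
```

In practice the matching phase would compare per-frame feature vectors (color histograms, motion, blur measures) rather than a single scalar, and the transfer step would depend on the application (colorization, deblurring, or rhythm adjustment).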
|Source Title:||Multimedia Systems|
|Appears in Collections:||Staff Publications|
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.