Title: Crowdsourced automatic zoom and scroll for video retargeting
Authors: Carlier, A.; Charvillat, V.; Ooi, W.T.; Grigoras, R.; Morin, G.
Keywords: automatic zoom and pan
Citation: Carlier, A., Charvillat, V., Ooi, W.T., Grigoras, R., Morin, G. (2010). Crowdsourced automatic zoom and scroll for video retargeting. MM'10 - Proceedings of the ACM Multimedia 2010 International Conference: 201-210. ScholarBank@NUS Repository. https://doi.org/10.1145/1873951.1873962
Abstract: Screen size and display resolution limit the experience of watching videos on mobile devices. The viewing experience can be improved by determining important or interesting regions within the video (called regions of interest, or ROIs) and displaying only the ROIs to the viewer. Previous work focuses on analyzing the video content using a visual attention model to infer the ROIs. Such content-based techniques, however, have limitations. In this paper, we propose an alternative paradigm to infer ROIs from a video. We crowdsource from a large number of users through their implicit viewing behavior using a zoom and pan interface, and infer the ROIs from their collective wisdom. A retargeted video, consisting of relevant shots determined from historical user behavior, can be automatically generated and replayed to subsequent users who prefer a less interactive viewing experience. This paper presents how we collect the user traces, infer the ROIs and their dynamics, group the ROIs into shots, and automatically reframe those shots to improve the aesthetics of the video. A user study with 48 participants shows that our automatically retargeted video is of comparable quality to one handcrafted by an expert user. © 2010 ACM
Source Title: MM'10 - Proceedings of the ACM Multimedia 2010 International Conference
Appears in Collections: Staff Publications
Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.