Title: In-video product annotation with web information mining
Authors: Li, G.
Wang, M.
Lu, Z.
Hong, R.
Chua, T.-S. 
Keywords: Product annotation
Video search
Web mining
Issue Date: 2012
Citation: Li, G., Wang, M., Lu, Z., Hong, R., Chua, T.-S. (2012). In-video product annotation with web information mining. ACM Transactions on Multimedia Computing, Communications and Applications 8 (4). ScholarBank@NUS Repository.
Abstract: Product annotation in videos is of great importance for video browsing, search, and advertisement. However, most existing automatic video annotation research focuses on annotating high-level concepts, such as events, scenes, and object categories. This article presents a novel solution to the annotation of specific products in videos by mining information from the Web. It collects a set of high-quality training data for each product by jointly leveraging Amazon and the Google image search engine. A visual signature for each product is then built from the bag-of-visual-words representation of the training images, and a correlative sparsification approach removes noisy bins from these signatures. The signatures are then matched against video frames to annotate them. We conduct experiments on more than 1,000 videos, and the results demonstrate the feasibility and effectiveness of our approach.
Source Title: ACM Transactions on Multimedia Computing, Communications and Applications
ISSN: 1551-6857
DOI: 10.1145/2379790.2379797
Appears in Collections:Staff Publications
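The pipeline in the abstract (per-product bag-of-visual-words signatures, sparsified and matched against video frames) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the toy vocabulary size, the `keep_ratio` threshold, and the simple magnitude-based pruning are assumptions standing in for the paper's correlative sparsification.

```python
# Hypothetical sketch of the BoVW product-signature pipeline described in
# the abstract. All parameters and data below are illustrative.
from collections import Counter
from math import sqrt

VOCAB_SIZE = 8  # toy visual vocabulary; real systems use thousands of words


def bovw_histogram(word_ids, vocab_size=VOCAB_SIZE):
    """Normalized bag-of-visual-words histogram for one image or frame."""
    counts = Counter(word_ids)
    total = sum(counts.values()) or 1
    return [counts.get(i, 0) / total for i in range(vocab_size)]


def build_signature(training_histograms, keep_ratio=0.5):
    """Average the training histograms, then zero out the weakest bins
    (a crude stand-in for removing noisy bins via sparsification)."""
    n = len(training_histograms)
    avg = [sum(h[i] for h in training_histograms) / n
           for i in range(len(training_histograms[0]))]
    keep = max(1, int(len(avg) * keep_ratio))
    threshold = sorted(avg, reverse=True)[keep - 1]
    return [w if w >= threshold else 0.0 for w in avg]


def cosine(a, b):
    """Cosine similarity between two histograms."""
    dot = sum(x * y for x, y in zip(a, b))
    na, nb = sqrt(sum(x * x for x in a)), sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0


# Toy training images for one product (visual-word ids per image).
train = [bovw_histogram([0, 0, 1, 2]), bovw_histogram([0, 1, 1, 3])]
signature = build_signature(train)

# Annotate a video frame: a high similarity score suggests the product
# appears in the frame.
frame = bovw_histogram([0, 1, 0, 1])
print(round(cosine(signature, frame), 3))
```

In the real system, the visual words would come from clustering local descriptors, and the sparsification would exploit correlations between products rather than raw bin magnitude.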

Files in This Item:
There are no files associated with this item.



Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.