Please use this identifier to cite or link to this item: https://doi.org/10.1109/ICDAR.2009.85
Title: A gradient difference based technique for video text detection
Authors: Shivakumara, P. 
Phan, T.Q. 
Tan, C.L. 
Issue Date: 2009
Source: Shivakumara, P., Phan, T.Q., Tan, C.L. (2009). A gradient difference based technique for video text detection. Proceedings of the International Conference on Document Analysis and Recognition, ICDAR: 156-160. ScholarBank@NUS Repository. https://doi.org/10.1109/ICDAR.2009.85
Abstract: Text detection in video images, particularly scene text detection, has received increasing attention because it plays a vital role in video indexing and information retrieval. This paper proposes a new and robust gradient difference technique for detecting both graphics text and scene text in video images. The technique introduces the concept of zero crossing to determine bounding boxes for the detected text lines, rather than relying on the conventional projection-profile-based method, which fails to fix bounding boxes when there is no proper spacing between the detected text lines. We demonstrate the capability of the proposed technique through experiments on video images containing both graphics text and scene text with different font shapes and sizes, languages, text directions, backgrounds, and contrasts. Our experimental results show that the proposed technique outperforms existing methods in terms of detection rate on a large video image database. © 2009 IEEE.
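As a rough illustration of the approach summarized above, the following Python sketch computes a gradient-difference map (local maximum minus local minimum of the horizontal gradient, which is large where text produces strong gradient swings) and a per-row zero-crossing count that can separate text lines even when projection-profile valleys are shallow. The window size `win`, threshold scale `k`, and the exact zero-crossing formulation are illustrative assumptions, not the parameters or procedure of the paper.

```python
# Illustrative sketch only: window size, threshold, and the zero-crossing
# formulation are assumptions, not the paper's exact method.
import numpy as np
from scipy.ndimage import maximum_filter, minimum_filter

def gradient_difference_map(gray: np.ndarray, win: int = 11) -> np.ndarray:
    """Local max minus local min of the horizontal gradient.

    Text regions exhibit strong positive and negative gradient swings
    within a small window, so their gradient difference is large.
    """
    g = gray.astype(np.float64)
    grad = np.zeros_like(g)
    grad[:, 1:] = np.diff(g, axis=1)            # horizontal first difference
    g_max = maximum_filter(grad, size=(1, win)) # windowed max along rows
    g_min = minimum_filter(grad, size=(1, win)) # windowed min along rows
    return g_max - g_min

def text_candidate_mask(gray: np.ndarray, win: int = 11, k: float = 1.2) -> np.ndarray:
    """Binary text candidates: gradient difference above k times its mean."""
    gd = gradient_difference_map(gray, win)
    return gd > k * gd.mean()

def row_zero_crossings(mask: np.ndarray) -> np.ndarray:
    """Count 0/1 transitions per row of the binary candidate mask.

    Rows passing through text have many transitions; rows between text
    lines have few, suggesting line boundaries even when the lines are
    too tightly spaced for projection-profile valleys to appear.
    """
    return np.abs(np.diff(mask.astype(np.int8), axis=1)).sum(axis=1)
```

Splitting the candidate mask at rows where the zero-crossing count drops below a small threshold would then yield one bounding box per text line; that threshold is again an assumed value, not one taken from the paper.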
Source Title: Proceedings of the International Conference on Document Analysis and Recognition, ICDAR
URI: http://scholarbank.nus.edu.sg/handle/10635/42053
ISBN: 9780769537252
ISSN: 1520-5363
DOI: 10.1109/ICDAR.2009.85
Appears in Collections: Staff Publications

Files in This Item:
There are no files associated with this item.

Scopus™ Citations: 35 (checked on Dec 18, 2017)
Page view(s): 71 (checked on Dec 16, 2017)
