Please use this identifier to cite or link to this item: http://scholarbank.nus.edu.sg/handle/10635/40905
Title: A new Iterative-Midpoint-Method for video character gap filling
Authors: Shivakumara, P. 
Bei Hong, D.
Zhao, D.
Tan, C.L. 
Pal, U.
Keywords: Character gap filling
Video character recognition
Video document analysis
Issue Date: 2012
Source: Shivakumara, P., Bei Hong, D., Zhao, D., Tan, C.L., Pal, U. (2012). A new Iterative-Midpoint-Method for video character gap filling. Proceedings - International Conference on Pattern Recognition : 673-676. ScholarBank@NUS Repository.
Abstract: We propose a new Iterative-Midpoint-Method (IMM) for video character gap filling based on end pixels and neighbor pixels in the extracted contour of a character. The method obtains an Enhanced Gradient Image (EGI) for the given gray character image to sharpen text pixels. Max-Min clustering and the K-means clustering algorithm with K=2 are applied to the EGI to obtain text candidates. To clean up background information, the intersection of the text candidate image and the Sobel edge image of the input is considered. The method then extracts the edges in the Canny edge image of the input that correspond to pixels in the intersection result; we call these the potential candidates, as they form a possible contour of the character with fewer disconnections. From the contour, we identify correct pairs of end pixels using a mutual-nearest-neighbor criterion. The three midpoints obtained from the two end pixels and their two preceding pixels are noted, and the distances between these three consecutive midpoints are used to predict a new midpoint. From the new midpoint, the method recursively computes midpoints until it reaches the end pixels, which results in updated end pixels. In this way, the method repeats the midpoint computation iteratively to fill the complete gap between the two end pixels. The method has been tested on 500 images: 200 character images from video, 200 character images from the ICDAR-2003 competition data, and 100 images from object data. A comparative study shows that the proposed method is superior to a baseline method in terms of recognition rate. © 2012 ICPR Org Committee.
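The core of the gap-filling step described above is the recursive computation of midpoints between a pair of contour end pixels until the gap is closed. The following is a minimal sketch of that recursive-midpoint core only; the full IMM additionally predicts midpoints from the distances between three consecutive midpoints derived from the end pixels and their preceding pixels, which is not reproduced here. The function name and the 8-connectivity stopping rule are illustrative assumptions, not the authors' exact implementation.

```python
def fill_gap(p, q, filled=None):
    """Recursively insert integer midpoints between end pixels p and q
    (given as (row, col) tuples) until every consecutive pair of pixels
    is 8-connected, closing the contour gap.

    Illustrative sketch only: the published IMM predicts midpoints from
    distances between three consecutive midpoints rather than by plain
    bisection as done here.
    """
    if filled is None:
        filled = set()
    # Stop when p and q are already 8-connected neighbors
    # (Chebyshev distance <= 1): no pixel is missing between them.
    if max(abs(p[0] - q[0]), abs(p[1] - q[1])) <= 1:
        return filled
    # Integer midpoint of the two pixels.
    m = ((p[0] + q[0]) // 2, (p[1] + q[1]) // 2)
    filled.add(m)
    # Recurse on both halves of the gap.
    fill_gap(p, m, filled)
    fill_gap(m, q, filled)
    return filled

# Filling a horizontal 5-pixel gap between end pixels (0, 0) and (0, 6):
gap_pixels = fill_gap((0, 0), (0, 6))
# → {(0, 1), (0, 2), (0, 3), (0, 4), (0, 5)}
```

Because each recursion halves the gap, the number of midpoint computations grows logarithmically in the gap length, which keeps the iterative filling cheap even for long disconnections.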
Source Title: Proceedings - International Conference on Pattern Recognition
URI: http://scholarbank.nus.edu.sg/handle/10635/40905
ISBN: 9784990644109
ISSN: 10514651
Appears in Collections: Staff Publications
