Please use this identifier to cite or link to this item:
https://scholarbank.nus.edu.sg/handle/10635/71662
DC Field | Value
---|---
dc.title | Robust approach towards text extraction from natural scene images captured via mobile devices
dc.contributor.author | Jian, Y.
dc.contributor.author | Kiong, T.K.
dc.contributor.author | Heng, L.T.
dc.date.accessioned | 2014-06-19T03:26:20Z
dc.date.available | 2014-06-19T03:26:20Z
dc.date.issued | 2009
dc.identifier.citation | Jian, Y., Kiong, T.K., Heng, L.T. (2009). Robust approach towards text extraction from natural scene images captured via mobile devices. Proceedings of the IASTED International Conference on Modelling, Simulation, and Identification, MSI 2009. ScholarBank@NUS Repository.
dc.identifier.isbn | 9780889868106
dc.identifier.uri | http://scholarbank.nus.edu.sg/handle/10635/71662
dc.description.abstract | The paper presents the development of a human-machine interactive software application, specifically useful for text extraction and translation from images captured using mobile and digital devices with cameras. The effectiveness of the proposed algorithm in meeting the challenges of processing such images is highlighted with real images. In view of the resource-constrained nature of mobile devices, the proposed solution balances recognition accuracy against processing speed.
dc.source | Scopus
dc.subject | OCR
dc.subject | Resource constraint
dc.subject | Text extraction
dc.subject | Translation
dc.type | Conference Paper
dc.contributor.department | ELECTRICAL & COMPUTER ENGINEERING
dc.description.sourcetitle | Proceedings of the IASTED International Conference on Modelling, Simulation, and Identification, MSI 2009
dc.identifier.isiut | NOT_IN_WOS
Appears in Collections: Staff Publications
Files in This Item:
There are no files associated with this item.