JOURNAL ARTICLE

Edge-based method for text detection from complex document images

Abstract

Detecting text in documents where it is embedded in complex colored and textured backgrounds is a very challenging problem. In this paper, we propose a simple texture-based approach that uses edge information for this task. We compare its performance against a method based on the discrete cosine transform, recently proposed by Y. Zhong et al. (2000) for text localization in compressed digital video. In our experiments, both methods performed about equally well on small text, while our method performed better on large text. The principal advantage of our approach is that, beyond text detection, the same edge representation can be reused for other image interpretation tasks.
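As a concrete illustration, the sketch below implements the general idea the abstract describes: text regions in a document image tend to produce dense clusters of short edges, so a local edge-density map can separate text from complex backgrounds. This is a minimal sketch assuming OpenCV; the Canny thresholds, window size, density threshold, and the component-filtering heuristics are illustrative assumptions, not the parameters of the paper's actual method.

import cv2
import numpy as np

def detect_text_regions(gray, win=15, density_thresh=0.15):
    # Return bounding boxes of candidate text regions in a grayscale image.
    # 1. Extract edges: text strokes yield many short, densely packed edges.
    edges = cv2.Canny(gray, 100, 200)

    # 2. Local edge density: fraction of edge pixels in each win x win window.
    density = cv2.boxFilter(edges.astype(np.float32) / 255.0, -1, (win, win))

    # 3. Threshold the density map to keep only edge-dense (text-like) pixels.
    mask = (density > density_thresh).astype(np.uint8) * 255

    # 4. Close horizontal gaps so characters in a line merge into one blob.
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (21, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)

    # 5. Keep components whose shape is plausible for a horizontal text line.
    n, _, stats, _ = cv2.connectedComponentsWithStats(mask)
    boxes = []
    for i in range(1, n):  # label 0 is the background component
        x, y, w, h, area = stats[i]
        if area > 100 and w > h:  # crude, assumed heuristic
            boxes.append((x, y, w, h))
    return boxes

For example, detect_text_regions(cv2.imread("page.png", cv2.IMREAD_GRAYSCALE)) (with "page.png" a hypothetical input) would return candidate boxes that a later verification stage could accept or reject; the DCT-based baseline mentioned above would instead measure local texture energy from block DCT coefficients.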

Keywords:
Computer science, Artificial intelligence, Edge detection, Text detection, Discrete cosine transform, Pattern recognition, Computer vision, Image processing, Image interpretation

Metrics

Cited by: 33
FWCI (Field-Weighted Citation Impact): 1.14
References: 14
Citation Normalized Percentile: 0.78

Topics

Handwritten Text Recognition Techniques
Image Retrieval and Classification Techniques
Advanced Image and Video Retrieval Techniques
(all classified under Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)

Related Documents

JOURNAL ARTICLE

New method for text detection and segmentation from complex images

Fang Liu, Xiang Peng, Tianjiang Wang

Journal: Proceedings of SPIE, the International Society for Optical Engineering. Year: 2007. Vol: 6786. Pages: 67863F.