2015
DOI: 10.1142/s0218001415530055

Multiscale Region Projection Method to Discriminate Between Printed and Handwritten Text on Registration Forms

Abstract: Techniques to identify printed and handwritten text in scanned documents differ significantly. In this paper, we address the question of how to discriminate between each type of writing on registration forms. Registration-form documents consist of various type zones, such as printed text, handwriting, table, image, noise, etc., so segmenting the various zones is a challenge. We adopt herein an approach called "multiscale-region projection" to identify printed text and handwriting. An important aspect of our app…
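The abstract is truncated, but the general idea behind projection-based discrimination can be illustrated with a minimal sketch. The function name `projection_features`, the scale choices, and the regularity statistics below are assumptions for illustration only, not the authors' exact multiscale-region projection algorithm: the sketch simply computes horizontal projection profiles of a binarized region at several scales and derives regularity features, since printed text usually produces more periodic profiles than handwriting.

```python
import numpy as np

def projection_features(region, scales=(1, 2, 4)):
    """Illustrative multiscale projection features for a binarized text region.

    `region` is a 2-D array with ink pixels = 1 and background = 0.
    At each scale the region is block-downsampled, its horizontal
    projection profile is computed, and simple regularity statistics
    are collected; these can feed a printed-vs-handwritten classifier.
    """
    feats = []
    for s in scales:
        # Crop to a multiple of s and downsample by summing s x s blocks.
        h, w = (region.shape[0] // s) * s, (region.shape[1] // s) * s
        blocks = region[:h, :w].reshape(h // s, s, w // s, s).sum(axis=(1, 3))
        profile = blocks.sum(axis=1).astype(float)       # horizontal projection
        if profile.sum() == 0:
            feats.extend([0.0, 0.0])
            continue
        profile /= profile.max()
        feats.append(profile.std())                      # spread of the profile
        feats.append(np.mean(np.abs(np.diff(profile))))  # local variation
    return np.array(feats)

# Toy region with evenly spaced text lines (a print-like, periodic pattern).
toy = np.zeros((64, 64), dtype=int)
toy[::8, :] = 1
print(projection_features(toy))
```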

Cited by 5 publications (2 citation statements) · References 30 publications
“…The ICDAR 2013 Hassaine competition data set has 475 × 4 = 1900 handwritten documents. The registration-form document data set (Tan et al., 2015) has 11,118 handwritten documents. Our goal was to identify the gender of the handwriting samples.…”
Section: Optical Character Recognition
Mentioning confidence: 99%
“…Two real handwritten data sets were used in our experiments for gender prediction: the ICDAR 2013 sets (Hassaine et al., 2013) and our registration-form document data sets (Tan et al., 2015).…”
Section: Data Sets
Mentioning confidence: 99%