2008
DOI: 10.1117/12.767121
Whole-book recognition using mutual-entropy-driven model adaptation

Cited by 13 publications (6 citation statements)
References 7 publications
“…We discovered that such disagreements are real and can be detected automatically (using cross entropy [58])-further, these disagreements, when summed over long passages (of many pages), correlate significantly with character and word error rates. Thus disagreement, a statistic which the algorithm can estimate, turns out to be a reliable proxy for error rate, which in an unsupervised setting is of course unavailable to the algorithm.…”
Section: B. Mutually Correcting Models
confidence: 87%
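The quoted statement describes summed cross-entropy disagreement between two models as an unsupervised proxy for error rate. A minimal sketch of that idea, with hypothetical model names and toy distributions (not the paper's actual implementation):

```python
import math

def cross_entropy(p, q, eps=1e-12):
    """Cross entropy H(p, q) = -sum_x p(x) * log q(x), in nats.
    p and q are dicts mapping symbols to probabilities."""
    return -sum(px * math.log(q.get(x, eps)) for x, px in p.items() if px > 0)

# Toy per-character posteriors from two hypothetical models of the same glyph:
# an image-based ("iconic") model and a language-based ("linguistic") model.
iconic = {"e": 0.7, "c": 0.2, "o": 0.1}
linguistic = {"e": 0.6, "c": 0.3, "o": 0.1}

# Disagreement score for a single character position.
score = cross_entropy(iconic, linguistic)

# Summed over a long passage (many positions), total disagreement is the
# statistic that, per the citation, correlates with character/word error rate.
passage = [(iconic, linguistic)] * 100  # stand-in for many character positions
total = sum(cross_entropy(p, q) for p, q in passage)
print(round(score, 4), round(total, 2))
```

Note that `cross_entropy(p, p)` is just the entropy of `p`, so the score is minimized when the two models agree, which is what makes summed disagreement usable as an error-rate proxy when ground truth is unavailable.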
“…For machines, such analysis can also benefit style-based document analysis. One example is whole-book recognition [22]. Our Lehigh notebook dataset provides a natural collection of documents that belong to different notebooks.…”
Section: Logical Clustering
confidence: 99%
“…Previous attempts include [2,5], [4], which uses recognition and verification, and [11], which proposes mutual-entropy-based model adaptation and demonstrates it on 10 pages. We investigate a different approach which exploits the similarity of word images in a book.…”
Section: Introduction
confidence: 99%