2011 International Conference on Document Analysis and Recognition
DOI: 10.1109/icdar.2011.75
Stroke-Based Performance Metrics for Handwritten Mathematical Expressions

Abstract: Evaluating mathematical expression recognition involves a complex interaction of input primitives (e.g. pen/finger strokes), recognized symbols, and recognized spatial structure. Existing performance metrics simplify this problem by separating the assessment of spatial structure from the assessment of symbol segmentation and classification. These metrics do not characterize the overall accuracy of a pen-based mathematics recognition system, making it difficult to compare math recognition algorithms, and preve…

Cited by 24 publications (20 citation statements) | References 10 publications
“…The main problem of the EMERS metric is that the representation ambiguity of coded mathematical expressions can produce different trees for the same expression. In [41], the authors proposed a representation that seems to cope with this ambiguity, and they also presented the corresponding performance metrics. However, the representation requires precise information about the structure of the expression at the stroke level, and the MathBrush corpus annotation did not contain this information completely and explicitly.…”
Section: Spatial Relations Classification (mentioning, confidence: 99%)
“…Evaluation of mathematical expression recognition systems is a difficult problem [5,19], and several metrics have been proposed [30,41,1]. The EMERS metric [30] is a dissimilarity value computed as the tree edit distance between the tree representation of the obtained expression and the reference expression, and it is not normalized.…”
Section: Spatial Relations Classification (mentioning, confidence: 99%)
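To make the tree-edit-distance idea behind EMERS concrete, here is a minimal Python sketch. It implements a simplified top-down alignment distance over ordered labeled trees with unit costs; this is an illustrative variant, not the actual EMERS algorithm, and the `Node` class is an assumption, not the authors' representation.

```python
class Node:
    """Ordered, labeled expression-tree node (illustrative format)."""
    def __init__(self, label, children=()):
        self.label = label
        self.children = tuple(children)

def subtree_size(n):
    return 1 + sum(subtree_size(c) for c in n.children)

def tree_edit_distance(a, b):
    """Simplified top-down tree edit distance with unit costs.

    Relabeling a node costs 1; inserting or deleting a node together
    with its whole subtree costs the subtree size.  This restricted
    'alignment' variant is cheaper but coarser than full tree edit
    distance algorithms such as Zhang-Shasha.
    """
    relabel = 0 if a.label == b.label else 1
    return relabel + _forest_distance(a.children, b.children)

def _forest_distance(f1, f2):
    # Sequence DP over the two ordered forests of child subtrees.
    n, m = len(f1), len(f2)
    D = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        D[i][0] = D[i - 1][0] + subtree_size(f1[i - 1])
    for j in range(1, m + 1):
        D[0][j] = D[0][j - 1] + subtree_size(f2[j - 1])
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            D[i][j] = min(
                D[i - 1][j] + subtree_size(f1[i - 1]),   # delete subtree
                D[i][j - 1] + subtree_size(f2[j - 1]),   # insert subtree
                D[i - 1][j - 1] + tree_edit_distance(f1[i - 1], f2[j - 1]),
            )
    return D[n][m]

# Example: reference "a + b" vs. recognition "a + d" differ by one relabel.
ref = Node('+', [Node('a'), Node('b')])
rec = Node('+', [Node('a'), Node('d')])
```

With these two trees the distance is 1 (one relabeled leaf); note that, as the statement above observes for EMERS, the raw value is not normalized by expression size.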
“…EMERS computes an edit distance using the tree representation of the ME. Zanibbi et al. [11] defined a set of performance metrics at different levels based on a bipartite graph representation; the metrics appear to rely on a canonical representation, but it is not detailed in the paper and no experimentation is reported.…”
Section: Evaluation Of ME Recognition Systems (mentioning, confidence: 99%)
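The stroke-level, graph-based metrics referred to above can be illustrated with a small sketch: treat strokes as nodes and encode a recognition result as labels on stroke pairs (e.g. "merge" when two strokes form one symbol, or a spatial-relation name), then score the fraction of pairs on which reference and recognition disagree. This is a hypothetical simplification for illustration, not the exact formulation of the cited paper, and all names are assumptions.

```python
from itertools import combinations

def stroke_pair_error_rate(strokes, ref_labels, rec_labels):
    """Fraction of stroke pairs whose labels disagree between the
    reference and the recognition.  A pair's label might be 'merge'
    (same symbol), a spatial relation name, or 'none' if unrelated.
    """
    pairs = list(combinations(strokes, 2))
    if not pairs:
        return 0.0
    errors = sum(
        ref_labels.get(p, 'none') != rec_labels.get(p, 'none') for p in pairs
    )
    return errors / len(pairs)

# Three strokes: the reference says s3 sits to the right of s2,
# while the recognizer placed s3 above s2 (one disagreeing pair of three).
strokes = ['s1', 's2', 's3']
ref = {('s1', 's2'): 'merge', ('s2', 's3'): 'right'}
rec = {('s1', 's2'): 'merge', ('s2', 's3'): 'above'}
```

Because every stroke pair is scored, segmentation and structure errors contribute to a single normalized number, which is the appeal of stroke-based metrics over separate symbol and structure scores.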
“…Given that the recognition of mathematical symbols can be stated as a regular classification problem, the classification error rate of individual symbols is usually reported as a performance measure. However, the recognition of the structural relations between mathematical symbols, which can be seen as a parsing problem, requires more sophisticated evaluation methods [8,11].…”
Section: Introduction (mentioning, confidence: 99%)
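The symbol-level measure mentioned in this statement is just an ordinary classification error rate, which can be sketched in a few lines (assuming, for illustration, that predicted and reference symbol lists are already aligned one-to-one):

```python
def symbol_error_rate(predicted, reference):
    """Plain classification error rate over individually segmented symbols.

    Assumes the two lists are aligned one-to-one; real evaluations must
    first solve the (harder) stroke-to-symbol alignment problem.
    """
    if len(predicted) != len(reference):
        raise ValueError("symbol lists must be aligned and equal-length")
    wrong = sum(p != r for p, r in zip(predicted, reference))
    return wrong / len(reference)
```

The contrast drawn in the statement is precisely that no such one-line measure exists for structural relations, which is why parsing-style metrics like those in [8,11] are needed.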