2020 17th International Conference on Frontiers in Handwriting Recognition (ICFHR)
DOI: 10.1109/icfhr2020.2020.00071
Online Handwritten Mathematical Symbol Segmentation and Recognition with Bidirectional Context

Cited by 9 publications (6 citation statements)
References 17 publications
“…We re-implemented the online bag-of-features proposed by Ung et al. [6] (denoted as M1) to compare with our proposed method. Since M1 uses an OnHME recognizer that outputs a symbol relation tree rather than a LaTeX sequence, we used the OnHME recognizer proposed in [23] combined with an n-gram language model to extract the online bag-of-features. This recognizer achieves an expression rate of 51.70% on the CROHME 2014 testing set, which is 2.82 percentage points better than MTAP.…”
Section: Discussion
confidence: 99%
“…Nguyen et al. applied a temporal classification method for recognizing mathematical symbols [7]. The method uses a BLSTM model to take advantage of the bidirectional context for classifying symbols.…”
Section: Symbol Recognition With Bidirectional Context
confidence: 99%
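As a rough illustration of how a BLSTM exploits bidirectional context for per-time-step symbol classification, the following PyTorch sketch is a minimal stand-in; the feature dimension, layer sizes, and class count are assumptions for illustration, not the configuration reported in [7].

```python
import torch
import torch.nn as nn

class BLSTMSymbolClassifier(nn.Module):
    """Minimal sketch: a bidirectional LSTM over point features of an online
    trajectory, emitting a symbol-class distribution at every time step.
    All sizes below are illustrative assumptions, not the cited setup."""

    def __init__(self, feat_dim=6, hidden=128, num_classes=101):
        super().__init__()
        self.blstm = nn.LSTM(feat_dim, hidden, num_layers=2,
                             batch_first=True, bidirectional=True)
        # Forward and backward hidden states are concatenated (2 * hidden),
        # so each time step sees both past and future context.
        self.out = nn.Linear(2 * hidden, num_classes)

    def forward(self, x):              # x: (batch, time, feat_dim)
        h, _ = self.blstm(x)           # h: (batch, time, 2 * hidden)
        return self.out(h)             # per-time-step class logits

# Usage: classify each time step of 4 trajectories with 50 points each.
model = BLSTMSymbolClassifier()
logits = model(torch.randn(4, 50, 6))  # shape (4, 50, 101)
```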
“…The deep BLSTM model is trained using a combination of the constraint loss and the CTC loss, as shown in Eq. (7):

$loss = loss_{CTC} + \lambda\, loss_{CE}$ (7)

where $\lambda$ is a weighting parameter determined experimentally.…”
Section: Constraint For Output At Precise Time Steps
confidence: 99%
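A hedged PyTorch sketch of how such a weighted combination of the CTC loss and a cross-entropy constraint loss could be computed; the tensor shapes, the value of λ, and the way the constrained time steps are selected are assumptions for illustration, not details taken from the citing paper.

```python
import torch
import torch.nn as nn

ctc_loss = nn.CTCLoss(blank=0)
ce_loss = nn.CrossEntropyLoss()
lam = 0.1  # weighting parameter λ; the cited work tunes it experimentally

def combined_loss(logits, targets, input_lens, target_lens,
                  constrained_steps, constrained_labels):
    """loss = loss_CTC + λ * loss_CE, following Eq. (7) of the citing paper.

    logits: (time, batch, classes) raw BLSTM outputs.
    constrained_steps / constrained_labels: the "precise time steps" at which
    a specific symbol label is enforced; how they are chosen is an assumption.
    """
    log_probs = logits.log_softmax(dim=-1)
    l_ctc = ctc_loss(log_probs, targets, input_lens, target_lens)
    # Cross-entropy applied only at the constrained time steps
    # (batch index 0 used here purely for illustration).
    l_ce = ce_loss(logits[constrained_steps, 0, :], constrained_labels)
    return l_ctc + lam * l_ce
```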