2021
DOI: 10.1007/978-3-030-86159-9_29
A Transformer-Based Math Language Model for Handwritten Math Expression Recognition

Cited by 6 publications (4 citation statements)
References 13 publications
“…A standard encoder-decoder model generates a LaTeX representation without considering this ambiguity, and so a decoder may generate an ungrammatical sequence such as x^{2 and x^{2}} for the expression x^{2}. To overcome this problem, many systems use linguistic context from mathematical language models (LMs) to support the decoding process [11,58,87,88]. While these LMs tend to be much simpler than full expression grammars, they serve the same purpose by introducing syntactic constraints in the space of possible expressions.…”
Section: Multimodal: Combining Online and Offline Inputs
Mentioning confidence: 99%
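The syntactic constraint described in the quote above can be illustrated with a simple brace-balance check, which rejects ungrammatical LaTeX candidates such as the quoted x^{2 and x^{2}}. This is a minimal sketch of one such constraint, not the decoding scheme of any cited system:

```python
def braces_balanced(seq: str) -> bool:
    """Return True if every '{' in a LaTeX token sequence has a matching '}'."""
    depth = 0
    for ch in seq:
        if ch == '{':
            depth += 1
        elif ch == '}':
            depth -= 1
            if depth < 0:  # a '}' with no matching opener, e.g. "x^{2}}"
                return False
    return depth == 0      # nonzero depth means an unclosed '{', e.g. "x^{2"

print(braces_balanced("x^{2"))    # False
print(braces_balanced("x^{2}}"))  # False
print(braces_balanced("x^{2}"))   # True
```

A full grammar or language model imposes richer constraints than this, but even a check this simple prunes the malformed candidates from the quoted example.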
“…Language Models (LMs) have often been used to resolve ambiguities in HME recognition [20,11,19,58,87,88], although they are generally not as effective as they are for natural languages, where redundancy is much higher and misrecognized characters can therefore often be replaced reliably by correct ones. Mathematical expressions, by contrast, follow a more formal language with less redundancy (e.g., a specific superscript in a formula appears just once).…”
Section: Multimodal: Combining Online and Offline Inputs
Mentioning confidence: 99%
“…Tree decoders pay less attention to context when predicting triples, and are often unable to distinguish visually similar symbols such as '2' and 'z'. Ung et al. (2021) try to employ a language model (LM) for post-correction, but Gupta et al. (2021) underline the inherent risk of depending wholly on an LM to correct low-redundancy information, such as numbers, which is particularly susceptible to biases introduced by probabilistic skewing.…”
Section: Introduction
Mentioning confidence: 99%
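The risk the quote attributes to Gupta et al. (2021) can be illustrated with a toy rescoring step: when a recognizer is uncertain between '2' and 'z', a character LM skewed toward letters can overturn a correct digit. All scores below are made up for illustration and do not come from any cited system:

```python
import math

# Hypothetical per-symbol log-probabilities from a visual recognizer and a
# character language model; the numbers are purely illustrative.
visual = {"2": math.log(0.55), "z": math.log(0.45)}  # recognizer slightly favours '2'
lm     = {"2": math.log(0.10), "z": math.log(0.90)}  # LM skewed toward letters

def rescore(weight: float) -> str:
    """Pick the symbol maximising visual log-score plus weighted LM log-score."""
    return max(visual, key=lambda s: visual[s] + weight * lm[s])

print(rescore(0.0))  # '2' -- visual evidence alone keeps the correct digit
print(rescore(1.0))  # 'z' -- a heavily weighted LM flips it to a letter
```

Because a digit in a formula usually occurs only once, there is no surrounding redundancy to pull the decision back toward '2', which is exactly the low-redundancy failure mode the quote describes.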