2021
DOI: 10.48550/arxiv.2105.02412
Preprint
Handwritten Mathematical Expression Recognition with Bidirectionally Trained Transformer

Abstract: Encoder-decoder models have made great progress on handwritten mathematical expression recognition recently. However, it is still a challenge for existing methods to assign attention to image features accurately. Moreover, those encoder-decoder models usually adopt RNN-based models in their decoder part, which makes them inefficient in processing long LaTeX sequences. In this paper, a transformer-based decoder is employed to replace RNN-based ones, which makes the whole model architecture very concise. Furt…
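The abstract's point about assigning attention to image features can be illustrated with a minimal pure-Python sketch of scaled dot-product attention, the core operation of a transformer decoder layer: each decoder query attends over encoder feature vectors and returns a weighted sum. This is an illustrative sketch only, not the paper's actual model; all function names and values here are assumptions.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention(query, keys, values):
    """Scaled dot-product attention: one decoder query vector
    attends over encoder feature vectors (keys paired with values)
    and returns the attention-weighted sum of the values."""
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    dim = len(values[0])
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(dim)]
```

With two identical keys the weights are uniform, so the output is the mean of the values; a transformer decoder applies this in parallel across all target positions, which is what makes it more efficient than a step-by-step RNN decoder on long LaTeX sequences.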

Cited by 0 publications
References 38 publications (51 reference statements)