2021
DOI: 10.48550/arxiv.2112.03603
Preprint
Handwritten Mathematical Expression Recognition via Attention Aggregation based Bi-directional Mutual Learning

Cited by 1 publication (1 citation statement)
References 0 publications
“…With the advent of Transformer architecture [3], there has been a pivotal shift in the approach to solving OCR problems. Sequence-to-sequence models have gained popularity [7], where attention modules were used in conjunction with CNNs to extract features and BiLSTMs were used as the decoder [10]. For the transformer approach, the input is required to be a sequence, which is generally done by CNNs to transform a two-dimensional input into a one-dimensional sequence.…”
Section: Optical Character Recognition
confidence: 99%
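The flattening step the quoted statement describes, where a CNN collapses a two-dimensional feature map into a one-dimensional sequence that a transformer can attend over, can be sketched as follows. This is a minimal illustration with assumed shapes, not the architecture of the cited paper:

```python
import numpy as np

# Hypothetical CNN output: batch of 1, an 8 x 32 spatial grid, 256 channels.
features = np.random.rand(1, 8, 32, 256)

# Merge the two spatial axes (height, width) into a single sequence axis,
# turning the 2D grid into a sequence of 256 feature vectors of dimension 256.
b, h, w, c = features.shape
sequence = features.reshape(b, h * w, c)

print(sequence.shape)  # (1, 256, 256): (batch, sequence length, feature dim)
```

Each position in the resulting sequence corresponds to one spatial location of the feature map; positional encodings are typically added afterwards so the transformer retains the original 2D layout.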