Findings of the Association for Computational Linguistics: NAACL 2022
DOI: 10.18653/v1/2022.findings-naacl.74
MWP-BERT: Numeracy-Augmented Pre-training for Math Word Problem Solving

Abstract: Math word problem (MWP) solving faces a dilemma in number representation learning. In order to avoid the number representation issue and reduce the search space of feasible solutions, existing works striving for MWP solving usually replace real numbers with symbolic placeholders to focus on logic reasoning. However, different from common symbolic reasoning tasks like program synthesis and knowledge graph reasoning, MWP solving has extra requirements in numerical reasoning. In other words, instead of the number…
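The placeholder substitution the abstract describes can be sketched as follows. This is an illustrative helper, not code from the MWP-BERT paper; the function name, the `NUM{i}` token format, and the regular expression are assumptions for the sketch.

```python
import re

def mask_numbers(problem: str):
    """Replace each numeric value in an MWP with a symbolic placeholder
    (NUM0, NUM1, ...), returning the masked text and the original numbers.

    Illustrative only: real MWP solvers typically use a tokenizer-aware
    pass rather than a bare regex.
    """
    numbers = []

    def repl(match):
        numbers.append(match.group(0))
        return f"NUM{len(numbers) - 1}"

    # Match integers and simple decimals; fractions/percentages omitted here.
    masked = re.sub(r"\d+(?:\.\d+)?", repl, problem)
    return masked, numbers

text = "Tom has 3 apples and buys 2.5 kg of pears for 10 dollars."
masked, nums = mask_numbers(text)
# masked == "Tom has NUM0 apples and buys NUM1 kg of pears for NUM2 dollars."
# nums == ["3", "2.5", "10"]
```

Masking this way shrinks the output vocabulary to a fixed set of placeholder tokens, which is the "reduced search space" the abstract refers to, at the cost of discarding the numbers' magnitudes.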

Cited by 47 publications (44 citation statements)
References 33 publications
“…Expression-Pointer Transformer (EPT) [21] adds expression fragmentation and operand-context separation to improve the transformer-based approach. Some work [22] uses the BERT encoder [6] to benefit from the power of a state-of-the-art language model to learn the meaning of natural sentences. However, these models lack a structural understanding of MWPs, i.e., the relationships between the numbers and the words.…”
Section: Related Work
confidence: 99%
“…Previous work has seen modest success on simpler or specialized mathematics problem benchmarks. Techniques based on co-training output to verify (9, 10) or predict expression trees (11–16) are able to solve elementary school-level math problem benchmarks, such as MAWPS and Math23k, with over 81% accuracy. However, these approaches do not extend to high-school, math Olympiad, or university-level courses.…”
Section: Related Work
confidence: 99%
“…Each math word problem can be solved by one linear algebra expression. • Ape-clean: the dataset Ape-clean [34] is the cleaned version of the Chinese MWP dataset Ape210k [35].…”
Section: Dataset and Baselines
confidence: 99%