2021
DOI: 10.1007/978-3-030-86365-4_6
Generating Math Word Problems from Equations with Topic Consistency Maintaining and Commonsense Enforcement

Abstract: Recent years have seen significant advances in text generation tasks with the help of neural language models. However, one challenging task has made little progress so far: generating math problem text from mathematical equations. In this paper, we present a novel equation-to-problem text generation model. In our model, 1) we propose a flexible scheme to effectively encode math equations, and we then enhance the equation encoder with a Variational Autoencoder (VAE); 2) given a math equation, we…
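The abstract's first component, a VAE-enhanced equation encoder, can be illustrated with a toy sketch. This is not the paper's code: the vocabulary, dimensions, and mean-pooling encoder are all illustrative assumptions, with random weights standing in for trained parameters; only the reparameterization trick is a standard VAE ingredient.

```python
# Illustrative sketch (not the paper's implementation): a toy VAE-style
# equation encoder in numpy. Equation tokens are embedded, mean-pooled,
# and mapped to a latent Gaussian via the reparameterization trick.
import numpy as np

rng = np.random.default_rng(0)

VOCAB = {"x": 0, "+": 1, "=": 2, "5": 3, "2": 4}  # toy equation vocabulary
EMB_DIM, LATENT_DIM = 8, 4

# Randomly initialized parameters stand in for trained weights.
embeddings = rng.normal(size=(len(VOCAB), EMB_DIM))
W_mu = rng.normal(size=(EMB_DIM, LATENT_DIM))
W_logvar = rng.normal(size=(EMB_DIM, LATENT_DIM))

def encode_equation(tokens):
    """Encode a list of equation tokens into a sampled latent vector z."""
    ids = [VOCAB[t] for t in tokens]
    h = embeddings[ids].mean(axis=0)        # mean-pool token embeddings
    mu, logvar = h @ W_mu, h @ W_logvar     # Gaussian parameters
    eps = rng.normal(size=LATENT_DIM)
    z = mu + np.exp(0.5 * logvar) * eps     # reparameterization trick
    return z, mu, logvar

z, mu, logvar = encode_equation(["x", "+", "2", "=", "5"])
print(z.shape)  # (4,)
```

In the full model, a decoder conditioned on z (plus the topic words and commonsense knowledge the abstract mentions) would generate the problem text; the sketch only covers the encoding side.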

Cited by 6 publications (3 citation statements); References 34 publications.
“…So this method is time-consuming and limited. Cao et al. (2021) incorporate topic controlling and commonsense enforcement in MWP generation. Data-to-text Generation: Data-to-text generation transforms structured data into descriptive texts (Siddharthan, 2001; Gatt and Krahmer, 2018).…”
Section: Related Work
confidence: 99%
“…A Variational Auto-Encoder (VAE) is used to generate the MWP from this encoding. Cao et al. (2021) also make use of a VAE to bridge the gap between abstract math tokens and text. In addition to the equation and commonsense knowledge graph as input, they take in the question text, as well as a set of words representing a topic.…”
Section: MWP Generation
confidence: 99%
“…Early solutions to MWP generation relied on template-based approaches (Polozov et al., 2015) and question rewriting (Koncel-Kedziorski et al., 2016). More recently, Recurrent Neural Networks (RNN) (Zhou and Huang, 2019; Liyanage and Ranathunga, 2020), fine-tuning of pre-trained language models, as well as Variational Autoencoders (VAE) (Cao et al., 2021) have been proposed. Only Liyanage and Ranathunga (2020) have tried their NN solution in a multilingual setting; however, the results are sub-optimal.…”
Section: Introduction
confidence: 99%