2018
DOI: 10.1609/aaai.v32i1.11956

A Deep Generative Framework for Paraphrase Generation

Abstract: Paraphrase generation is an important problem in NLP, especially in question answering, information retrieval, information extraction, and conversation systems, to name a few. In this paper, we address the problem of generating paraphrases automatically. Our proposed method is based on a combination of deep generative models (VAE) with sequence-to-sequence models (LSTM) to generate paraphrases, given an input sentence. Traditional VAEs, when combined with recurrent neural networks, can generate free text but they are…
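The architecture the abstract describes, a VAE whose LSTM encoder and decoder are both conditioned on the original sentence, can be sketched roughly as below. This is a minimal illustration assuming PyTorch; all module names, dimensions, and wiring details are chosen for exposition and are not taken from the paper's implementation.

```python
# Minimal sketch of a conditional VAE-LSTM paraphrase model.
# All names and hyperparameters here are illustrative assumptions.
import torch
import torch.nn as nn

class ConditionalVAEParaphraser(nn.Module):
    def __init__(self, vocab_size, emb_dim=300, hid_dim=600, z_dim=300):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Separate encoders for the original sentence (the condition)
        # and the target paraphrase.
        self.cond_enc = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.para_enc = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        # Recognition network q(z | paraphrase, original).
        self.to_mu = nn.Linear(2 * hid_dim, z_dim)
        self.to_logvar = nn.Linear(2 * hid_dim, z_dim)
        # Decoder sees the token embedding, the latent code z, and the
        # condition encoding at every step.
        self.dec = nn.LSTM(emb_dim + z_dim + hid_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, original, paraphrase):
        _, (h_c, _) = self.cond_enc(self.embed(original))    # condition encoding
        _, (h_p, _) = self.para_enc(self.embed(paraphrase))  # paraphrase encoding
        h = torch.cat([h_c[-1], h_p[-1]], dim=-1)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        # Reparameterization trick: z = mu + sigma * eps.
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        emb = self.embed(paraphrase)                         # teacher forcing
        steps = emb.size(1)
        ctx = torch.cat([z, h_c[-1]], dim=-1).unsqueeze(1).expand(-1, steps, -1)
        dec_out, _ = self.dec(torch.cat([emb, ctx], dim=-1))
        return self.out(dec_out), mu, logvar
```

Training such a model would minimize token-level reconstruction cross-entropy plus the KL divergence between q(z | paraphrase, original) and the prior, typically with KL annealing to keep the latent code from collapsing.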


Cited by 71 publications (19 citation statements)
References 18 publications
“…Paraphrase Generation has proven to be useful for adversarial training and data augmentation (Zhou and Bhat, 2021). Early methods adopt hand-crafted rules (McKeown, 1983), synonym substitution (Bolshakov and Gelbukh, 2004), machine translation (Quirk et al., 2004), and deep learning (Gupta et al., 2018) to improve the quality of generated sentences. To acquire syntactically diverse samples, recent studies incorporate reinforcement learning (Qian et al., 2019) or syntactic constraints (Iyyer et al., 2018; Goyal and Durrett, 2020; Sun et al., 2021) into the models.…”
Section: Related Work
confidence: 99%
“…Paraphrases express the surface forms of the underlying semantic content [6] and capture the essence of language diversity [46]. Early work on automatic paraphrase generation is generally rule-based [7,8], but the recent trend brings to the fore neural network solutions [47,19,10,12,13,20,6]. Current research on paraphrasing mainly focuses on supervised methods, which require the availability of a large number of source and target pairs.…”
Section: Paraphrase Generation
confidence: 99%
“…Prakash et al. [9] and Gupta et al. [10] have proposed applying the Seq2Seq model, which is based on Long Short-Term Memory (LSTM), for paraphrasing. Prakash et al. noted that applying the residual network architecture [11] to the stacked LSTM network architecture [9] was beneficial for training.…”
Section: Seq2Seq
confidence: 99%
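The residual stacking this excerpt attributes to Prakash et al. amounts to adding an identity shortcut around each LSTM layer in the stack. A minimal PyTorch sketch follows; the layer count and dimensions are assumptions made for the example, not values from the cited work.

```python
# Sketch of residual connections across stacked LSTM layers.
# Hidden size equals input size so the identity shortcut type-checks.
import torch
import torch.nn as nn

class ResidualStackedLSTM(nn.Module):
    def __init__(self, dim, num_layers=4):
        super().__init__()
        self.layers = nn.ModuleList(
            [nn.LSTM(dim, dim, batch_first=True) for _ in range(num_layers)]
        )

    def forward(self, x):
        for lstm in self.layers:
            out, _ = lstm(x)
            x = x + out  # residual shortcut around each LSTM layer
        return x
```

The shortcut gives gradients a direct path through the stack, which is the stated benefit for training deeper recurrent models.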