2020
DOI: 10.48550/arxiv.2006.14223
Preprint
Neural Machine Translation For Paraphrase Generation

Cited by 8 publications (10 citation statements). References 0 publications.
“…Kazemnejad et al. (2020) proposed a retrieval-based approach that retrieves paraphrases from a large corpus. Sokolov and Filimonov (2020) cast paraphrase generation as a machine translation task; others extended the idea of bilingual pivoting for paraphrase generation, where the input sentence is first translated into a foreign language and then translated back as the paraphrase.…”
Section: Related Work
confidence: 99%
“…Others extended the idea of bilingual pivoting for paraphrase generation, where the input sentence is first translated into a foreign language and then translated back as the paraphrase. Sokolov and Filimonov (2020) trained an MT model on multilingual parallel data and then fine-tuned it on parallel paraphrase data.…”
Section: Related Work
confidence: 99%
“…Qian et al. (2019) employed distinct semantic style embeddings to generate diverse paraphrases, and Iyyer et al. (2018), Chen et al. (2019), and Goyal and Durrett (2020) proposed to use different syntactic structure templates. A line of work (Sokolov and Filimonov, 2020) formalized paraphrase generation as machine translation. Unsupervised paraphrase generation is primarily based on reinforcement-learning (RL) generative models (Ranzato et al., 2015; Li et al., 2016b).…”
Section: Related Work
confidence: 99%
“…Building a strong paraphrase generation system usually requires massive amounts of high-quality annotated paraphrase pairs, but existing labeled datasets (Lin et al., 2014; Fader et al., 2013; Lan et al., 2017) are either small or restricted to narrow domains. To avoid such a heavy reliance on labeled data, recent work has explored unsupervised methods (Li et al., 2018b; Fu et al., 2019; Siddique et al., 2020) that generate paraphrases without annotated training data; among these, the back-translation-based model is an archetype (Sokolov and Filimonov, 2020). It borrows the idea of back-translation (BT) from machine translation (Sennrich et al., 2016), where the model first translates a sentence s1 into another sentence s2 in a different language (e.g., En→Fr) and then translates s2 back to s1.…”
Section: Introduction
confidence: 99%
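The back-translation loop quoted above can be sketched in a few lines. This is a minimal, hypothetical illustration: the `translate` function here is a toy word-lookup stand-in, not the neural MT model the cited works use, and the vocabulary tables are invented for the example.

```python
# Sketch of the back-translation (BT) paraphrasing loop: translate a
# sentence s1 into a pivot language (En -> Fr), then translate the result
# s2 back into the source language to obtain a paraphrase of s1.
# The "translation" is a toy lookup table standing in for a real MT model.

EN_TO_FR = {"the": "le", "cat": "chat", "sleeps": "dort"}
FR_TO_EN = {"le": "the", "chat": "cat", "dort": "is sleeping"}

def translate(sentence: str, table: dict) -> str:
    """Toy word-by-word 'translation' via a lookup table."""
    return " ".join(table.get(word, word) for word in sentence.split())

def paraphrase_via_back_translation(s1: str) -> str:
    s2 = translate(s1, EN_TO_FR)    # forward pass: En -> Fr pivot
    return translate(s2, FR_TO_EN)  # backward pass: Fr -> En paraphrase

print(paraphrase_via_back_translation("the cat sleeps"))
# -> "the cat is sleeping"
```

Because the backward model need not invert the forward model exactly, the round trip can surface an alternative wording of the input, which is precisely what BT-based paraphrasing exploits.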
“…Malandrakis et al. (2019) introduce a similar notion of paraphrasing and apply variational autoencoders to control the quality of paraphrases. Sokolov and Filimonov (2020) tackle a similar problem of paraphrasing utterances with entity types, but implement the slot-copy mechanism via pre-processing and post-processing. In addition, Liu et al. (2013) apply paraphrases to improve natural language understanding in an NLU system, both for augmenting rules and for enhancing features.…”
Section: Related Work
confidence: 99%