2019 IEEE Automatic Speech Recognition and Understanding Workshop (ASRU)
DOI: 10.1109/asru46091.2019.9003874
Paraphrase Generation Based on VAE and Pointer-Generator Networks

Cited by 3 publications (2 citation statements)
References 12 publications
“…Our tagging technique aims to achieve good paraphrase generation performance with a low-cost fine-tuning task that utilizes already published pretrained models. Compared to other Pointer Generator Network approaches such as (Ravuru et al., 2019), our approach is cheaper and more flexible given pre-trained weights. Furthermore, easy modifications can be made to our source code to make TAGPA work with other (future) pretrained encoder-decoder architectures.…”
Section: Why mBART?
Confidence: 99%
“…PGN and a new loss function are also introduced to address these issues. Ravuru et al. proposed combining a VAE [12] with the PGN [8] and applying them to the paraphrase generation task [16]. Zhang et al. combined the Transformer architecture with the PGN [17].…”
Section: B. Pointer Generator Network (PGN)
Confidence: 99%
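The statements above cite the pointer-generator network (PGN), whose core idea is to mix the decoder's vocabulary distribution with a copy distribution taken from attention over the source tokens. As a minimal sketch of that mixing step (in the style of standard PGN formulations, not the cited paper's exact implementation; all function and variable names here are illustrative):

```python
import numpy as np

def pointer_generator_distribution(p_vocab, attention, src_ids, p_gen):
    """Illustrative pointer-generator mixing step.

    p_vocab:   (vocab_size,) softmax distribution over the fixed vocabulary
    attention: (src_len,) attention weights over source positions (sums to 1)
    src_ids:   (src_len,) vocabulary id of each source token
    p_gen:     scalar in [0, 1], probability of generating vs. copying

    Returns the final distribution:
        p_gen * P_vocab(w) + (1 - p_gen) * sum of attention on source
        positions where token w appears.
    """
    final = p_gen * np.asarray(p_vocab, dtype=float)
    # Scatter-add the copy probability mass onto the ids of the source tokens;
    # a plain loop handles repeated source tokens correctly.
    for pos, tok_id in enumerate(src_ids):
        final[tok_id] += (1.0 - p_gen) * attention[pos]
    return final
```

Because both input distributions sum to 1, the mixture also sums to 1; tokens that appear in the source gain extra probability mass, which is what lets a PGN copy rare or out-of-vocabulary words during paraphrase generation.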