2017
DOI: 10.48550/arxiv.1705.10929
Preprint

Adversarial Generation of Natural Language

Cited by 60 publications (57 citation statements)
References 13 publications
“…In such a framework, the generator is not directly exposed to the ground truth data, but instead learns to imitate it using global feedback from the discriminator. This has led to several attempts to use GANs for text generation, with a generator using either a recurrent neural network (RNN) (Guo et al., 2017; Press et al., 2017; Rajeswar et al., 2017) or a convolutional neural network (CNN) (Gulrajani et al., 2017; Rajeswar et al., 2017).…”
Section: Introduction
confidence: 99%
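
The excerpt above describes the core adversarial setup: the generator never sees the ground-truth text and learns only from the discriminator's global real/fake signal. Below is a minimal, hypothetical PyTorch sketch of that setup; the module names, sizes, and the soft-token relaxation (generating softmax distributions rather than sampled tokens, so gradients can flow from the discriminator to the generator) are illustrative assumptions, not the cited papers' actual code.

import torch
import torch.nn as nn

VOCAB, EMB, HID, SEQ_LEN, BATCH = 100, 32, 64, 20, 8

class Generator(nn.Module):
    # RNN generator: maps noise to a soft distribution over tokens per step.
    def __init__(self):
        super().__init__()
        self.rnn = nn.GRU(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, VOCAB)

    def forward(self, z):                      # z: (BATCH, SEQ_LEN, EMB)
        h, _ = self.rnn(z)
        return torch.softmax(self.out(h), -1)  # (BATCH, SEQ_LEN, VOCAB)

class Discriminator(nn.Module):
    # Scores a (soft or one-hot) token sequence as real vs. generated.
    def __init__(self):
        super().__init__()
        self.emb = nn.Linear(VOCAB, EMB, bias=False)
        self.rnn = nn.GRU(EMB, HID, batch_first=True)
        self.score = nn.Linear(HID, 1)

    def forward(self, x):                      # x: (BATCH, SEQ_LEN, VOCAB)
        h, _ = self.rnn(self.emb(x))
        return self.score(h[:, -1])            # one real/fake logit per sequence

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=1e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(real_ids):                      # real_ids: (BATCH, SEQ_LEN) ints
    real = nn.functional.one_hot(real_ids, VOCAB).float()
    fake = G(torch.randn(BATCH, SEQ_LEN, EMB))

    # Discriminator update: real sequences scored 1, generated ones 0.
    d_loss = bce(D(real), torch.ones(BATCH, 1)) + \
             bce(D(fake.detach()), torch.zeros(BATCH, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator update: only the discriminator's global feedback reaches G;
    # G never sees the ground-truth tokens directly.
    g_loss = bce(D(fake), torch.ones(BATCH, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

The soft-token relaxation is one of two common workarounds for the non-differentiability of discrete sampling; the other is to treat generation as a reinforcement-learning problem and pass the discriminator's score back as a reward.
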
“…Over the years, GAN [54] has been used in many computer vision related tasks [55]–[57] as well as in the natural language domain [58], [59]. Furthermore, GAN has also been used for reconstructing medical images [60]–[62], face images [63], etc.…”
Section: B. Applications of GAN
confidence: 99%
“…They were among the first to apply such techniques to neural language generation, but to date, training with log-likelihood maximization (Xie, 2017) has been the main workhorse. Alternatively, Rajeswar et al. (2017), among others, have tried using Generative Adversarial Networks (GANs) for text generation. However, Caccia et al. (2018) observed problems with training GANs and showed that, to date, they are unable to beat canonical sequence decoder methods.…”
Section: Language Generation
confidence: 99%
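
For contrast with the adversarial setup sketched earlier, here is a minimal hypothetical sketch of the log-likelihood ("teacher forcing") objective this excerpt calls the main workhorse; the names and sizes are again illustrative assumptions, not any cited paper's code.

import torch
import torch.nn as nn

VOCAB, EMB, HID = 100, 32, 64

class LM(nn.Module):
    # Plain autoregressive language model trained by maximum likelihood.
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.rnn = nn.GRU(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, VOCAB)

    def forward(self, ids):                    # ids: (BATCH, SEQ_LEN)
        h, _ = self.rnn(self.emb(ids))
        return self.out(h)                     # logits: (BATCH, SEQ_LEN, VOCAB)

lm = LM()
opt = torch.optim.Adam(lm.parameters(), lr=1e-3)
ce = nn.CrossEntropyLoss()

def mle_step(ids):
    # Teacher forcing: the model conditions on the true prefix and is trained
    # to assign high probability to the next ground-truth token. Unlike the
    # GAN setup, the ground truth is exposed to the model at every step.
    logits = lm(ids[:, :-1])
    loss = ce(logits.reshape(-1, VOCAB), ids[:, 1:].reshape(-1))
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()

This maximum-likelihood baseline is the "canonical sequence decoder" approach that, per Caccia et al. (2018), GAN-trained text generators have so far failed to beat.
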