Proceedings of the 2nd Workshop on Representation Learning for NLP 2017
DOI: 10.18653/v1/w17-2629

Adversarial Generation of Natural Language

Abstract: Generative Adversarial Networks (GANs) have gathered a lot of attention from the computer vision community, yielding impressive results for image generation. Advances in the adversarial generation of natural language from noise, however, are not commensurate with the progress made in generating images, and still lag far behind likelihood-based methods. In this paper, we take a step towards generating natural language with a GAN objective alone. We introduce a simple baseline that addresses the discrete output space problem without relying on gradient estimators and show that it is able to achieve state-of-the-art results on a Chinese poem generation dataset. We present quantitative results on generating sentences from context-free and probabilistic context-free grammars, and qualitative language modeling results. A conditional version is also described that can generate sequences conditioned on sentence characteristics.
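The abstract's key move is avoiding gradient estimators for the discrete output space. A minimal PyTorch sketch of one plausible reading: the generator emits a softmax distribution over the vocabulary at each step, and the discriminator consumes those continuous distributions directly, so gradients flow end to end. The architecture, names, and sizes below are illustrative assumptions, not the authors' released code.

    # Hypothetical sketch (assumed architecture and sizes, not the authors'
    # released code): the generator emits a softmax distribution over the
    # vocabulary at every step, so the discriminator sees a differentiable
    # "soft" sentence and no gradient estimator (REINFORCE, Gumbel) is needed.
    import torch
    import torch.nn as nn

    VOCAB, SEQ_LEN, NOISE, HID = 5000, 11, 100, 256  # illustrative sizes

    class Generator(nn.Module):
        def __init__(self):
            super().__init__()
            self.proj = nn.Linear(NOISE, HID)
            self.rnn = nn.GRU(VOCAB, HID, batch_first=True)
            self.out = nn.Linear(HID, VOCAB)

        def forward(self, z):                                  # z: (batch, NOISE)
            h = torch.tanh(self.proj(z)).unsqueeze(0)          # initial GRU state
            tok = torch.full((z.size(0), 1, VOCAB), 1.0 / VOCAB)  # uniform start token
            seq = []
            for _ in range(SEQ_LEN):
                o, h = self.rnn(tok, h)
                tok = torch.softmax(self.out(o), dim=-1)       # soft token, stays differentiable
                seq.append(tok)
            return torch.cat(seq, dim=1)                       # (batch, SEQ_LEN, VOCAB)

    class Discriminator(nn.Module):
        def __init__(self):
            super().__init__()
            self.rnn = nn.GRU(VOCAB, HID, batch_first=True)
            self.score = nn.Linear(HID, 1)

        def forward(self, x):         # x: one-hot (real) or softmax (generated) sequences
            _, h = self.rnn(x)
            return self.score(h[-1])  # real/fake logit per sequence

Real sentences would enter the discriminator as one-hot sequences of the same (batch, length, vocabulary) shape, so both networks can be trained with the usual GAN losses and no discrete sampling step breaks differentiability.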

Cited by 72 publications (42 citation statements) · References 26 publications. The citation statements below are ordered by relevance.
“…Correspondingly, adversarial training has also been investigated for both text generation [86], [87] and speech synthesis [78]. In particular, sentence generation conditioned on sentiment (either positive or negative) has been conducted in [14] and [21], but in both cases only on fixed-length sequences (11 words in [14] and 40 words in [21]). One example is the set of three generated samples of fixed length (40 words) from [21] (also shown in Table I).…”
Section: Approaches in Other Modalities
Citation type: mentioning
confidence: 99%
“…Curriculum learning has been successfully applied to GANs in several domains. Subramanian et al. (2017) and Press et al. (2017) use curriculum learning for text generation by gradually increasing the length of the character sequences in the text as training progresses. Karras et al. (2018) apply curriculum learning to image generation by progressively increasing the image resolution.…”
Section: Related Work
Citation type: mentioning
confidence: 99%
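As an illustration of the length-curriculum idea described in the snippet above, here is a minimal sketch; the schedule and parameter values are assumptions for illustration, not taken from Subramanian et al. (2017) or Press et al. (2017).

    # Illustrative length curriculum (assumed schedule, not from either cited
    # paper): train on short sequences first and lengthen them over time.
    def curriculum_length(step, start_len=1, max_len=40, steps_per_stage=2000):
        """Sequence length to use at a given global training step."""
        return min(max_len, start_len + step // steps_per_stage)

    # Usage: truncate each batch (shape: batch_size x max_len x ...) to the
    # current curriculum length before feeding it to the GAN.
    def make_batch(batch, step):
        return batch[:, :curriculum_length(step)]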
“…Computations for probabilistic inference are often intractable, and GANs provide a viable alternative that uses neural architectures to train a generative model. GANs have been well received and applied to a variety of synthesis tasks (Karras et al. 2018; Subramanian et al. 2017; Radford, Metz, and Chintala 2015; Reed et al. 2016b).…”
Section: Introduction
Citation type: mentioning
confidence: 99%
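For reference, the adversarial objective these works rely on is the standard minimax game of Goodfellow et al. (2014), in which a discriminator D and a generator G are trained against each other:

    \min_G \max_D V(D, G) = \mathbb{E}_{x \sim p_{\text{data}}}\left[\log D(x)\right] + \mathbb{E}_{z \sim p_z}\left[\log\bigl(1 - D(G(z))\bigr)\right]

Optimizing G against a trained D sidesteps explicit likelihood computation, which is what makes GANs attractive when probabilistic inference is intractable.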
“…More recently, Bengio et al. [1] provided additional evidence that curriculum strategies can benefit neural network training, with experimental results on tasks such as shape recognition and language modelling. Since then, empirical successes have been observed for several computer vision [14, 49] and natural language processing (NLP) tasks [36, 42, 60].…”
Section: Introduction
Citation type: mentioning
confidence: 99%