2019
DOI: 10.1109/taslp.2019.2943018

BFGAN: Backward and Forward Generative Adversarial Networks for Lexically Constrained Sentence Generation

Abstract: Incorporating prior knowledge like lexical constraints into the model's output to generate meaningful and coherent sentences has many applications in dialogue systems, machine translation, image captioning, etc. However, existing RNN-based models incrementally generate sentences from left to right via beam search, which makes it difficult to directly introduce lexical constraints into the generated sentences. In this paper, we propose a new algorithmic framework, dubbed BFGAN, to address this challenge. Specifi…
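To make the constrained-decoding idea concrete, below is a minimal Python sketch of backward/forward generation around a single lexical constraint, in the spirit of the title's backward and forward generators. The sampler interfaces (sample_backward, sample_forward) and the stub samplers are hypothetical placeholders, not the paper's implementation, and the adversarial discriminator that guides training is omitted entirely.

```python
# Minimal sketch of BFGAN-style decoding around one lexical constraint:
# a backward generator emits the tokens to the LEFT of the constraint
# (right to left), and a forward generator then completes the sentence to
# the RIGHT of the resulting prefix. The sampler callables below are
# hypothetical placeholders; in the paper both generators are trained
# language models and a discriminator guides their adversarial training.

def generate_with_constraint(constraint, sample_backward, sample_forward, max_len=20):
    """Return a token list that is guaranteed to contain `constraint`."""
    # Backward half: generated right-to-left, so reverse it afterwards.
    left = []
    while len(left) < max_len:
        tok = sample_backward([constraint] + left)
        if tok == "<eos>":
            break
        left.append(tok)
    prefix = list(reversed(left)) + [constraint]

    # Forward half: ordinary left-to-right continuation of the prefix.
    right = []
    while len(right) < max_len:
        tok = sample_forward(prefix + right)
        if tok == "<eos>":
            break
        right.append(tok)
    return prefix + right


# Toy usage with stub samplers (illustration only).
if __name__ == "__main__":
    backward = lambda ctx: {1: "really", 2: "I"}.get(len(ctx), "<eos>")
    forward = lambda ctx: {3: "this", 4: "movie", 5: "."}.get(len(ctx), "<eos>")
    print(" ".join(generate_with_constraint("enjoyed", backward, forward)))
    # -> I really enjoyed this movie .
```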

Cited by 34 publications (26 citation statements)
References 23 publications
“…We follow the work of [6] and extend the backward and forward generators to adapt to the loanword identification task. In our study, we use the loanwords of a specific language as the lexical constraint to generate more training data.…”
Section: Generators (mentioning, confidence: 99%)
“…As a commonly used method to alleviate data sparseness, data augmentation is one of the most popular approaches for this task. For example, Liu et al [6] proposed to use a GAN model consisting of two generators and one discriminator to produce meaningful natural language sentences. Motivated by this study, we propose to use a lexical constraint-based data augmentation model to generate more training data for loanword identification.…”
Section: Introduction (mentioning, confidence: 99%)
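The augmentation loop described in this excerpt can be pictured as follows. This is a hedged sketch, not the cited authors' code: constrained_generate (standing in for a BFGAN-style backward/forward generator pair) and the LOAN/O labelling scheme are assumptions introduced for illustration.

```python
# Hypothetical sketch of lexical-constraint-based data augmentation for
# loanword identification: each known loanword becomes the lexical
# constraint of one generated sentence, and the constraint position
# supplies a token-level label. `constrained_generate` and the label
# scheme are assumptions, not the cited authors' implementation.

def augment(loanwords, constrained_generate):
    """Return (tokens, labels) pairs, one synthetic sentence per loanword."""
    data = []
    for word in loanwords:
        tokens = constrained_generate(word)                       # sentence containing `word`
        labels = ["LOAN" if t == word else "O" for t in tokens]   # mark the constraint token
        data.append((tokens, labels))
    return data


# Toy usage with a stub generator (illustration only).
if __name__ == "__main__":
    stub = lambda w: ["this", "sentence", "contains", w, "."]
    print(augment(["radio", "televizor"], stub))
```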
“…The authors declare no conflict of interest.…”
[The remainder of this excerpt is a spilled comparison table from the citing survey (datasets such as Visual Genome and IAPRTC-12, recall and F-1 score columns); its content is not recoverable from the extraction.]
Section: Conflicts Of Interest (mentioning, confidence: 99%)
“…Text infilling aims at filling in the missing part of a sentence or paragraph by making use of the past and future information around the missing part, which can be used in many real-world natural language generation scenarios, for example, fill-in-the-blank image captioning (Sun et al, 2017), lexically constrained sentence generation (Liu et al, 2018b), missing value reconstruction (e.g. for damaged or historical documents) (Berglund et al, 2015), acrostic poetry generation (Liu et al, 2018a), and text representation learning (Devlin et al, 2018).…”
Section: Introduction (mentioning, confidence: 99%)