Directed Beam Search: Plug-and-Play Lexically Constrained Language Generation
Preprint, 2020
DOI: 10.48550/arxiv.2012.15416

Cited by 6 publications (14 citation statements) | References 0 publications
“…Previous works proposed beam search variants specifically for lexically constrained decoding (Hokamp & Liu, 2017; Pascual et al., 2020; Lu et al., 2021), which enforce constraints during search in a discrete space. Recent works consider constraint satisfaction by adjusting vocabulary distributions using an additional discriminator or LM (Dathathri et al., 2019; Krause et al., 2021).…”
Section: Related Work (mentioning)
confidence: 99%
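The statement above describes enforcing lexical constraints during beam search in the discrete token space. A minimal sketch of that idea, using a hypothetical toy bigram LM (not the authors' actual algorithm): hypotheses are expanded and scored as in standard beam search, and finished hypotheses that miss the constraint word are discarded.

```python
import math

# Hypothetical toy bigram LM: P(next | prev) over a tiny vocabulary.
LM = {
    "<s>": {"the": 0.6, "a": 0.4},
    "the": {"cat": 0.5, "dog": 0.5},
    "a":   {"cat": 0.3, "dog": 0.7},
    "cat": {"sat": 0.8, "ran": 0.2},
    "dog": {"sat": 0.4, "ran": 0.6},
}

def constrained_beam_search(constraint, steps=3, beam=2):
    """Beam search in discrete token space with a hard lexical
    constraint: hypotheses that never produce the constraint word
    are pruned at the end (a minimal sketch of the idea, not the
    cited papers' exact algorithms)."""
    beams = [(["<s>"], 0.0)]  # (sequence, cumulative log-prob)
    for _ in range(steps):
        candidates = []
        for seq, score in beams:
            for tok, p in LM[seq[-1]].items():
                candidates.append((seq + [tok], score + math.log(p)))
        candidates.sort(key=lambda x: -x[1])
        beams = candidates[:beam]  # keep the top-`beam` hypotheses
    satisfying = [(s, sc) for s, sc in beams if constraint in s]
    return max(satisfying, key=lambda x: x[1])[0] if satisfying else None
```

With the constraint "dog", the unconstrained best hypothesis ("the cat sat") is discarded in favor of the best hypothesis containing the constraint.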
“…Black-box approaches include "Prompt Engineering" methods, which aim to change the prompts fed into the base LM at inference time (Wallace et al., 2019; Li and Liang, 2021). Guided generation aims to build a controllable "guiding" model that shifts the output of the base LM at inference time (Krause et al., 2020; Pascual et al., 2020).…”
Section: Plug-and-play Conditional Generation (mentioning)
confidence: 99%
“…We seek "plug-and-play" approaches to controllable text generation wherein new language models can be slotted into existing generative systems; new language models are being developed and it becomes intractable to update and retrain controlled generation architectures. Plug-and-play techniques such as (Krause et al., 2020; Pascual et al., 2020) aim to only intervene with the outputs (a vector of logits) of a generative language model. This becomes especially important as the latest iteration of very large pre-trained language models such as GPT-3 (Brown et al., 2020) restricts access to the hidden states and layer weights of models.…”
Section: Introduction (mentioning)
confidence: 99%
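The statement above captures the core plug-and-play mechanism: intervene only on the logit vector the base LM emits, never on its weights or hidden states. A minimal sketch of that intervention, with a hypothetical toy vocabulary and a made-up bias value:

```python
import math

def softmax(logits):
    m = max(logits)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def boost_keywords(logits, keyword_ids, bias=4.0):
    # Hypothetical plug-and-play step: add a fixed bias to the logits
    # of constraint tokens; only the output vector is modified, the
    # base model itself is untouched.
    return [x + bias if i in keyword_ids else x
            for i, x in enumerate(logits)]

# Toy 5-token vocabulary; token 3 is a lexical constraint.
base_logits = [2.0, 1.0, 0.5, 0.0, -1.0]
p_before = softmax(base_logits)
p_after = softmax(boost_keywords(base_logits, {3}))
```

After the boost, the constraint token's probability rises and all other tokens' probabilities shrink, while the distribution still sums to one.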
“…The goal is to generate a given set of control words in the responses of one of the speakers (agent or customer). Naive constrained generation approaches (Pascual et al., 2020; Miao et al., 2019) use methods like beam search and stochastic search to force the generation of these control words for short-term control, where control words need to appear in a single utterance or phrase. Because they do not consider the future, these approaches may generate the words all at once in a single response or not generate them at natural places in the conversation (Figure 1, left).…”
Section: Introduction (mentioning)
confidence: 99%
“…To alleviate this issue, we retrieve similar conversations from training and condition on them during generation. We first identify similar neighbors using a kNN-based approach and then guide the language model towards generating similar responses, inspired by plug-and-play methods (Madotto et al., 2021; Dathathri et al., 2019; Pascual et al., 2020). The motivation for this is that retrieved conversations guide the model to generate the control words at more natural points in the conversation.…”
Section: Introduction (mentioning)
confidence: 99%
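The retrieval step described above, identifying similar neighbors with a kNN-based approach, can be sketched with a simple bag-of-words cosine similarity over a hypothetical toy corpus (the cited work's actual embeddings and corpus are not specified here):

```python
import math
from collections import Counter

def cosine(a, b):
    # Cosine similarity between two sparse bag-of-words vectors.
    dot = sum(count * b.get(tok, 0) for tok, count in a.items())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def knn_retrieve(query, corpus, k=2):
    """Return the k conversations most similar to the query.
    A minimal kNN sketch; real systems would use learned embeddings."""
    q = Counter(query.lower().split())
    scored = [(cosine(q, Counter(doc.lower().split())), doc)
              for doc in corpus]
    scored.sort(key=lambda x: -x[0])
    return [doc for _, doc in scored[:k]]

# Hypothetical training conversations to retrieve from.
corpus = [
    "book a flight to paris",
    "cancel my hotel booking",
    "weather today",
]
neighbors = knn_retrieve("flight to paris please", corpus, k=1)
```

The retrieved neighbors would then condition generation, steering the model toward emitting control words where similar conversations placed them.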