Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume (2021)
DOI: 10.18653/v1/2021.eacl-main.223
Changing the Mind of Transformers for Topically-Controllable Language Generation

Abstract: Large Transformer-based language models can aid human authors by suggesting plausible continuations of text written so far. However, current interactive writing assistants do not allow authors to guide text generation in desired topical directions. To address this limitation, we design a framework that displays multiple candidate upcoming topics, of which a user can select a subset to guide the generation. Our framework consists of two components: (1) a method that produces a set of candidate topics by predict…
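To give a concrete feel for the two-component framework the abstract describes, the sketch below is a simplified, hypothetical stand-in: it clusters the embeddings of GPT-2's most likely next tokens to surface candidate "topics" (a rough analogue of component 1), then crudely conditions generation on a user-chosen cluster by appending its words to the prompt (unlike the paper, which uses a dedicated topic-conditioned generation model). The model choice, cluster count, and prompt-appending trick are illustrative assumptions, not the authors' method.

# Hypothetical sketch only: clusters likely next-token embeddings into candidate
# "topics" and appends a chosen cluster's words to the prompt. This is NOT the
# paper's actual approach, which trains a topic-conditioned generator.
import torch
from sklearn.cluster import KMeans
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "The scientists studied the ocean because"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# Rough analogue of component 1: cluster the embeddings of the 200 most likely
# next tokens to present a handful of candidate "topics" to the user.
with torch.no_grad():
    next_token_logits = model(input_ids).logits[0, -1]
top_ids = torch.topk(next_token_logits, k=200).indices
top_embeddings = model.transformer.wte.weight[top_ids].detach().numpy()
labels = KMeans(n_clusters=5, n_init=10).fit_predict(top_embeddings)

candidate_topics = []
for cluster in range(5):
    words = [tokenizer.decode(int(top_ids[i])).strip()
             for i in range(len(top_ids)) if labels[i] == cluster]
    candidate_topics.append(words[:5])  # keep a few representative words per cluster
print("Candidate topics:", candidate_topics)

# Rough analogue of component 2: "guide" generation toward the chosen topic by
# appending its words to the prompt (a crude stand-in for topic conditioning).
chosen = candidate_topics[0]
guided_ids = tokenizer.encode(prompt + " (" + ", ".join(chosen) + ")",
                              return_tensors="pt")
output = model.generate(guided_ids, max_length=60, do_sample=True, top_p=0.9,
                        pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(output[0], skip_special_tokens=True))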

Cited by 6 publications (2 citation statements) · References 35 publications
“…Guiding model output entails human intervention in the automatic text generation process, such as specifying crucial parts of the input text to be included in the output (Gehrmann et al., 2019), or providing semantic prompts or style parameters (Osone et al., 2021; Chang et al., 2021; Strobelt et al., 2022). Our study involves human guidance in headline generation by specifying keyword prompts.…”
Section: Human-AI Interactions for Text Summarization (mentioning, confidence: 99%)
“…Paul, Chang, and McCallum (2021) use the co-occurring relation between a sentence pattern and its entity pair to improve relation extraction in Verga et al. (2016). Chang et al. (2021) use the co-occurring relation between a context paragraph and its subsequent words to control the topics of language generation. In the future, the approach might also be used to improve the efficiency of document similarity estimation (Luan et al., 2020).…”
Section: Ethics Statement (mentioning, confidence: 99%)