Proceedings of the First Workshop on Narrative Understanding 2019
DOI: 10.18653/v1/w19-2405

Unsupervised Hierarchical Story Infilling

Abstract: Story infilling involves predicting words to go into a missing span from a story. This challenging task has the potential to transform interactive tools for creative writing. However, state-of-the-art conditional language models have trouble balancing fluency and coherence with novelty and diversity. We address this limitation with a hierarchical model which first selects a set of rare words and then generates text conditioned on that set. By relegating the high entropy task of picking rare words to a word-sam…

Cited by 38 publications (50 citation statements)
References 20 publications
“…Story generation. Recent work seeks to generate stories given a title and storyline (Yao et al., 2019), entities (Clark et al., 2018), premise (Fan et al., 2018), or surrounding context and rare words (Ippolito et al., 2019). Our work differs in that we aim to build systems capable of making predictions based only on text context, rather than aspects specific to stories (e.g.…”
Section: Related Work (mentioning, confidence: 99%)
“…proposes an iterative inference algorithm based on gradient search for text infilling. For story infilling, Ippolito et al. (2019) first predict rare words in the missing span, and then generate text conditioned on these words. SpanBERT (Joshi et al., 2020) masks random contiguous spans and (pre-)trains a language model to predict tokens in the span.…”
Section: Related Work (mentioning, confidence: 99%)
“…Generating a span of missing tokens in a text chunk, known as “text infilling,” has attracted much attention recently (Zhu et al., 2019; Song et al., 2019; Ippolito et al., 2019; Joshi et al., 2020). Here we study the related but somewhat different task of “sentence infilling.”…”
Section: Introduction (mentioning, confidence: 99%)
“…There has been a variety of work focusing on generating stories in plot-controllable, plan-driven, or constrained ways (e.g. Riedl and Young, 2010; Fan et al., 2018; Peng et al., 2018; Jain et al., 2017; Lebowitz, 1987; Ippolito et al., 2019; Pérez y Pérez and Sharples, 2001). Similar work in creative generation has conditioned on keywords for poetry generation (Yan, 2016; Ghazvininejad et al., 2016; Wang et al., 2016).…”
Section: Controllable Story Generation (mentioning, confidence: 99%)