Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP 2020)
DOI: 10.18653/v1/2020.emnlp-main.349
PlotMachines: Outline-Conditioned Generation with Dynamic Plot State Tracking

Abstract: We propose the task of outline-conditioned story generation: given an outline as a set of phrases that describe key characters and events to appear in a story, the task is to generate a coherent narrative that is consistent with the provided outline. This task is challenging as the input only provides a rough sketch of the plot, and thus, models need to generate a story by interweaving the key points provided in the outline. This requires the model to keep track of the dynamic states of the latent plot, condit…
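The abstract defines the task interface: an unordered set of outline phrases in, a full story out. As a rough, hypothetical illustration of that interface only, the sketch below prompts an off-the-shelf GPT-2 with outline phrases via the Hugging Face transformers library. It does not implement PlotMachines' dynamic plot state tracking, and the prompt format, outline phrases, and sampling settings are all assumptions made for illustration.

# Hypothetical sketch of the outline-conditioned generation *interface* only:
# prompt an off-the-shelf GPT-2 with outline phrases. This is NOT the
# PlotMachines model, which additionally tracks plot state in a memory.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# An outline: an unordered set of key phrases (characters, events).
outline = ["haunted lighthouse", "estranged sisters", "storm at sea"]  # made-up example
prompt = "Outline: " + "; ".join(outline) + "\nStory:"

inputs = tokenizer(prompt, return_tensors="pt")
story_ids = model.generate(
    **inputs,
    max_new_tokens=200,  # sampling settings are arbitrary choices
    do_sample=True,
    top_p=0.9,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(story_ids[0], skip_special_tokens=True))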

Cited by 59 publications (34 citation statements) | References 24 publications
“…GPT-2 (Radford et al, 2019) is a transformer-based pretrained language model that has been exploited in various generation tasks like story generation (Dathathri et al, 2020; Rashkin et al, 2020). However, one issue with the GPT-2 model is that it can only perform uni-directional generation.…”
Section: Baselines: Event Infilling (citation type: mentioning)
confidence: 99%
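The uni-directionality this quote points to comes from GPT-2's causal (lower-triangular) attention mask: each position may attend only to earlier positions, so generation is strictly left-to-right and cannot condition on text to the right of the current token, which is what makes event infilling awkward for GPT-2. A minimal PyTorch sketch of such a mask (illustrative only, not GPT-2's internal code):

import torch

def causal_mask(seq_len: int) -> torch.Tensor:
    # Lower-triangular boolean mask: position i may attend to positions <= i.
    # A mask of this shape is applied inside every GPT-2 self-attention layer,
    # which is why its generation is strictly left-to-right.
    return torch.tril(torch.ones(seq_len, seq_len, dtype=torch.bool))

print(causal_mask(5).int())
# tensor([[1, 0, 0, 0, 0],
#         [1, 1, 0, 0, 0],
#         [1, 1, 1, 0, 0],
#         [1, 1, 1, 1, 0],
#         [1, 1, 1, 1, 1]])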
“…Story Generation: Inspired by the traditional pipeline of Reiter and Dale (2000), recent work tackles generation of stories in a coarse-to-fine manner (Fan et al, 2018): based on a premise, a structured outline is generated first, and then an outline-conditioned model generates the full story. To represent the story outline, existing approaches typically either model it as a latent variable or use symbolic representations such as key phrases (Xu et al, 2018; Yao et al, 2019; Goldfarb-Tarrant et al, 2019; Gupta et al, 2019; Rashkin et al, 2020), short summaries (Jain et al, 2017; Chen et al, 2019), verb-argument tuples (Martin et al, 2018), or PropBank predicates and arguments (Fan et al, 2019; Goldfarb-Tarrant et al, 2020). Our work can be viewed as an extension of this direction, where a Content Planner model generates an outline as a sequence of FrameNet frames, and our methods generate a surface-form story.…”
Section: Related Work (citation type: mentioning)
confidence: 99%
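The coarse-to-fine pipeline this quote describes is a two-stage decomposition: a content planner maps a premise to an outline, and an outline-conditioned model realizes the full story. The structural sketch below uses plain keyphrases as the outline representation and hard-coded stand-ins for the two learned models; every name in it is hypothetical and chosen only to show the shape of the pipeline.

from typing import List

def plan_outline(premise: str) -> List[str]:
    # Stage 1 (content planner): premise -> outline.
    # Real systems use learned models and richer representations
    # (keyphrases, summaries, verb-argument tuples, FrameNet frames);
    # a fixed list stands in here purely for illustration.
    return ["introduce protagonist", "inciting incident", "resolution"]

def realize_story(premise: str, outline: List[str]) -> str:
    # Stage 2 (surface realization): outline -> full story, e.g. an
    # outline-conditioned language model such as PlotMachines.
    return f"A story about '{premise}' covering: " + ", ".join(outline)

premise = "a lighthouse keeper discovers a message in a bottle"
print(realize_story(premise, plan_outline(premise)))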
“…Recent work leverages the success of large pretrained language models to generate long texts such as stories (Rashkin et al, 2020), reviews (Cho et al, 2019a), and fake news (Zellers et al, 2019). Most end-user applications for assisting user writing, however, are confined to sentence-level generation (Kannan et al, 2016; Alikaniotis and Raheja, 2019; Prabhumoye et al, 2019; Faltings et al, 2021).…”
Section: Document Generation (citation type: mentioning)
confidence: 99%
“…Large pre-trained language models such as T5 and GPT-3 (Raffel et al, 2019; Brown et al, 2020) have enabled impressive progress on a variety of natural language generation tasks by producing fluent, coherent long texts (Rashkin et al, 2020; Zellers et al, 2019). While automated document-level generation seems tantalizingly within reach, a high branching factor presents significant challenges in tailoring generated documents to the specific requirements of users.…”
Section: Introduction (citation type: mentioning)
confidence: 99%