Proceedings of the 28th International Conference on Computational Linguistics 2020
DOI: 10.18653/v1/2020.coling-main.211
Facts2Story: Controlling Text Generation by Key Facts

Abstract: Recent advancements in self-attention neural network architectures have raised the bar for open-ended text generation. Yet, while current methods are capable of producing coherent text several hundred words long, attaining control over the content being generated, as well as evaluating it, are still open questions. We propose a controlled generation task based on expanding a sequence of facts, expressed in natural language, into a longer narrative. We introduce human-based evaluation m…

Cited by 11 publications (8 citation statements). References 19 publications.
“…Wikipedia has also been used to construct datasets for other text generation tasks, such as generating Wikipedia movie plots (Orbach and Goldberg, 2020; Rashkin et al., 2020) and short Wikipedia event summaries (Gholipour Ghalandari et al., 2020), and summarizing Wikipedia documents (Zopf, 2018; or summaries of aspects of interest (Hayashi et al., 2020) from relevant documents.…”
Section: Related Work
confidence: 99%
“…To alleviate this problem, one research direction adopts coarse-to-fine progressive text generation (Tan et al., 2021). This generation paradigm has been studied in many text generation systems for specific tasks, such as data-to-text generation (Moryossef et al., 2019; Puduppully and Lapata, 2021), storytelling (Goldfarb-Tarrant et al., 2020; Orbach and Goldberg, 2020), and dialogue generation (Xu et al., 2020a). Our work adopts a generative event transition planner that is trained on a large number of event transition paths, aiming to arrange the ensuing events in open-ended text generation.…”
Section: Related Work
confidence: 99%