Proceedings of the Second Workshop on Storytelling 2019
DOI: 10.18653/v1/w19-3407
Narrative Generation in the Wild: Methods from NaNoGenMo

Abstract: In text generation, generating long stories is still a challenge. Coherence tends to decrease rapidly as the output length increases. Especially for generated stories, coherence of the narrative is an important quality aspect of the output text. In this paper we examine how narrative coherence is attained in the submissions of NaNoGenMo 2018, an online text generation event where participants are challenged to generate a 50,000 word novel. We list the main approaches that were used to generate coherent narratives…

Cited by 9 publications (7 citation statements)
References 27 publications
“…Various authors have investigated ways to improve the coherence of generated texts, for example by using planning strategies [13,29], learning frameworks [12] and coherence metrics [19]. Some generation systems do not actually enforce coherence in their outputs, but only suggest it through their form or content to trick the reader [11,24].…”
Section: Neural Language Generation
Confidence: 99%
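As a rough illustration of the "suggested rather than enforced" coherence mentioned in the excerpt above (this sketch is not from the cited paper; all names and templates are invented for the example), a NaNoGenMo-style generator can make its output look story-shaped purely through a repeated surface form, without tracking any plot state:

```python
import random

# Minimal sketch: every "chapter" follows the same setup-event-reaction
# skeleton, so the text reads as narrative in form only. No state is
# carried between chapters, so coherence is suggested, not enforced.

CHARACTERS = ["the archivist", "a tired detective", "the lighthouse keeper"]
PLACES = ["the flooded library", "a night train", "the old observatory"]
EVENTS = ["found a sealed letter", "heard footsteps overhead", "lost track of time"]
REACTIONS = ["and decided to wait", "and wrote it all down", "and said nothing"]

def chapter(n: int) -> str:
    """Fill the fixed skeleton with randomly chosen content."""
    return (f"Chapter {n}. In {random.choice(PLACES)}, "
            f"{random.choice(CHARACTERS)} {random.choice(EVENTS)} "
            f"{random.choice(REACTIONS)}.")

if __name__ == "__main__":
    # Ten chapters that resemble a short novel only in their outward structure.
    print("\n\n".join(chapter(i + 1) for i in range(10)))
```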
“…For example, if at some point a protagonist opens a window to air the room and the antagonist enters the building through this open window later as the story unfolds, such 'window' could be regarded as a perfect example of Chekhov's Gun. Although there are several recent results in the areas of suspense generation Doust and Piwek (2017), narrative personalization Wang et al (2017), and generation of short context-based narratives Womack and Freeman (2019), generating long stories is still a challenge van Stegeren and Theune (2019). We believe that CGR could provide deep insights into further computational research of narrative structure and is a vital component for the generation of longer entertaining stories.…”
Section: CGR Task
Confidence: 98%
“…While existing models can generate stories with good local coherence, generating long stories is challenging. Difficulties in coalescing individual phrases into coherent plots and in maintaining character consistency throughout the story lead to a rapid decrease in coherence as the output length increases (van Stegeren and Theune, 2019). Neural narrative generation combining story-writing with human collaboration in an interactive way improves both story quality and human engagement (Goldfarb-Tarrant et al, 2019).…”
Section: Narrative Generation / Story Telling
Confidence: 99%