WriterForcing: Generating more interesting story endings
2019 · Preprint · DOI: 10.48550/arxiv.1907.08259

Cited by 3 publications (4 citation statements). References 0 publications.
“…Among the various attempts at language generation, story generation has been one of the most popular. In [18], the authors generated more interesting and diverse story endings by training models to emphasize specific keywords and generate non-generic phrases. Luo et al. incorporated sentiment analysis into story generation by implementing a sentiment analyser and a sentiment generator [25].…”
Section: Language Generation (mentioning, confidence: 99%)
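The keyword-emphasis idea this statement attributes to [18] can be sketched as a reweighted training loss. The snippet below is a minimal, assumed illustration of an inverse-token-frequency (ITF) weighted cross-entropy that upweights rare tokens, nudging the model away from generic endings; all function and tensor names are hypothetical, and this is not claimed to be the indexed paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def itf_weighted_nll(logits, targets, token_freq, eps=1.0):
    """Cross-entropy reweighted by inverse token frequency, so rare
    (more "interesting") tokens contribute more to the loss.
    Illustrative sketch only, not the cited paper's implementation.

    logits:     (batch, seq_len, vocab) decoder outputs
    targets:    (batch, seq_len) gold ending token ids
    token_freq: (vocab,) corpus counts per token (assumed precomputed)
    """
    # Inverse token frequency weights; eps avoids division by zero.
    itf = 1.0 / (token_freq.float() + eps)                 # (vocab,)
    per_token = F.cross_entropy(
        logits.reshape(-1, logits.size(-1)),
        targets.reshape(-1),
        reduction="none",
    )                                                      # (batch*seq_len,)
    weights = itf[targets.reshape(-1)]                     # rarity of each gold token
    return (weights * per_token).sum() / weights.sum()

# Toy shapes: vocab of 5, batch of 2, endings of length 3.
logits = torch.randn(2, 3, 5)
targets = torch.randint(0, 5, (2, 3))
freq = torch.tensor([1000, 500, 50, 5, 1])
print(itf_weighted_nll(logits, targets, freq))
```

Weighting by the gold token's rarity is one simple way to penalize the high-frequency, generic phrasing that plain cross-entropy tends to favor.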
“…Previous works (Li, Ding, and Liu 2018; Gupta et al. 2019) are mainly based on the Sequence-to-Sequence (Seq2Seq) model (Luong, Pham, and Manning 2015). Because they generate a sentence in a single left-to-right pass and are trained with Maximum Likelihood Estimation, they suffer from the well-known issue of producing incoherent and generic plots.…”
Section: Introduction (mentioning, confidence: 99%)
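For context, the left-to-right MLE (teacher-forcing) training this statement blames for generic plots can be sketched as follows. The module and helper names are illustrative assumptions, not taken from the cited works: the objective maximizes log p(y_t | y_&lt;t, context) token by token, and nothing in it rewards coherence or diversity.

```python
import torch.nn as nn

class DecoderStep(nn.Module):
    """One left-to-right decoding step of a plain Seq2Seq model
    (illustrative sketch, not a cited implementation)."""
    def __init__(self, vocab, hidden):
        super().__init__()
        self.embed = nn.Embedding(vocab, hidden)
        self.rnn = nn.GRUCell(hidden, hidden)
        self.out = nn.Linear(hidden, vocab)

    def forward(self, prev_token, state):
        # Consume the previous token, update the state, score the next token.
        state = self.rnn(self.embed(prev_token), state)
        return self.out(state), state

def mle_loss(decoder, context_state, gold):
    """Teacher-forced MLE: feed the gold prefix, predict each next token.

    context_state: (batch, hidden) encoder summary of the story context
    gold:          (batch, seq_len) gold ending token ids (LongTensor)
    """
    criterion = nn.CrossEntropyLoss()
    loss, state = 0.0, context_state
    for t in range(gold.size(1) - 1):
        logits, state = decoder(gold[:, t], state)
        loss = loss + criterion(logits, gold[:, t + 1])
    return loss / (gold.size(1) - 1)
```

Since every training signal comes from matching the single gold continuation, the model is pushed toward safe, high-probability phrasing, which is the failure mode the quoted introduction describes.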
“…The ACAT (Assistive Context Aware Toolkit) project (Nachman, Prasad et al. 2018) is an open-source platform developed at Intel Labs to enable people with motor neuron diseases to have full access to the capabilities and applications of their computers using very limited user input (e.g. gaze, a single muscle movement, facial gestures, etc.).…”
Section: Introduction (mentioning, confidence: 99%)
“…Response Generator is displayed twice, since it is called twice. … loss functions (See et al. 2019; Gupta et al. 2019), ranking responses (Gao et al. 2020), etc. Very few methods today apply online learning approaches to train these generative models (Wu, Li, and Yu 2020).…”
Section: Introduction (mentioning, confidence: 99%)
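The "ranking responses" strategy mentioned in the statement above can be illustrated with a minimal sketch: generate several candidate responses, then reorder them with an external scorer. The scorer and the function below are illustrative assumptions, not the cited papers' actual APIs.

```python
from typing import Callable, List

def rerank(candidates: List[str], scorer: Callable[[str], float]) -> List[str]:
    """Order candidate generations by an external quality score
    (e.g. a learned coherence or specificity model).
    Names here are illustrative, not from the cited papers."""
    return sorted(candidates, key=scorer, reverse=True)

# Toy usage: prefer longer (less generic) candidates.
endings = ["The end.", "She smiled.", "She finally opened the dusty letter."]
print(rerank(endings, scorer=lambda s: float(len(s.split()))))
```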