Proceedings of the 12th International Conference on Natural Language Generation 2019
DOI: 10.18653/v1/w19-8614

Generating Text from Anonymised Structures

Abstract: Surface realisation maps a meaning representation (MR) to a text, usually a single sentence. In this paper, we introduce a new parallel dataset of deep meaning representations and French sentences, and we present a novel method for MR-to-text generation which seeks to generalise by abstracting away from lexical content. Most current work on natural language generation focuses on generating text that matches a reference, using BLEU as the evaluation criterion. In this paper, we additionally consider the model's ability…
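To make the abstract's idea of "abstracting away from lexical content" concrete, below is a minimal Python sketch of the general anonymise-generate-relexicalise pattern used in delexicalisation-based NLG. The MR format, function names, and the toy French example are illustrative assumptions, not the paper's actual data format or implementation.

```python
# Minimal sketch (assumed, not the authors' code) of generating text from an
# anonymised structure: open-class lexical values in the meaning
# representation (MR) are replaced by placeholders before generation and
# restored afterwards, so the generator only sees delexicalised structures.

def anonymise(mr):
    """Replace lexical values in a flat MR with numbered placeholders."""
    mapping, anonymised = {}, {}
    for slot, value in mr.items():
        placeholder = f"X{len(mapping)}"
        mapping[placeholder] = value
        anonymised[slot] = placeholder
    return anonymised, mapping

def relexicalise(text, mapping):
    """Substitute the original lexical items back into the generated text."""
    for placeholder, value in mapping.items():
        text = text.replace(placeholder, value)
    return text

# Toy usage with a hypothetical French MR-to-text pair.
mr = {"agent": "Marie", "predicate": "aimer", "patient": "Paris"}
anonymised_mr, mapping = anonymise(mr)
# A trained generator would realise anonymised_mr as a delexicalised
# sentence; we hard-code one here as a stand-in for model output.
delexicalised = "X0 aime X2 ."
print(relexicalise(delexicalised, mapping))  # -> "Marie aime Paris ."
```

Because the generator is trained on placeholder sequences rather than word forms, it can in principle generalise to MRs whose lexical items never appeared at training time, which is the kind of generalisation the abstract targets.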

Cited by 8 publications (6 citation statements, all classified as "mentioning") · References 20 publications · Citing publications: 2020–2024

Citation statements (ordered by relevance):
“…With recent advances in neural networks, researchers have built different neural models based on various strategies, e.g. latent variables (Wiseman et al., 2018; Ye et al., 2020), structure awareness (Liu et al., 2018; Colin and Gardent, 2019), copy mechanism (Gehrmann et al., 2018; Puduppully et al., 2019a,b), and pre-trained language models (PLMs) (Chen et al., 2020a; Kale, 2020; Ribeiro et al., 2020). […] (2020) adapted the pre-trained GPT-2 model with different architectural designs, e.g. switch policy (Chen et al., 2020b) and content matching (Gong et al., 2020), to address the few-shot table-to-text generation problem.…”
Section: Discussion · Citation type: mentioning · Confidence: 99%
“…End-to-End Models. Many existing studies are dedicated to building end-to-end neural models with different strategies like soft-templates (Wiseman et al., 2018; Ye et al., 2020), attention awareness (Liu et al., 2018; Colin and Gardent, 2019), and retrieved prototypes ([…] Su et al., 2021b). Gehrmann et al. (2018), Puduppully et al. (2019a,b), and Chen et al. (2020b) adopted a copy mechanism for content selection to improve the information coverage of the outputs.…”
Section: Related Work · Citation type: mentioning · Confidence: 99%
“…With recent advances in neural networks, researchers have built different neural models based on various strategies, e.g. latent variables (Wiseman et al., 2018; Ye et al., 2020), structure awareness (Liu et al., 2018; Colin and Gardent, 2019), copy mechanism (Gehrmann et al., 2018; Puduppully et al., 2019a,b), and pre-trained language models (PLMs) (Chen et al., 2020a; Kale, 2020; Ribeiro et al., 2020). More recently, to alleviate the data-hungry nature of neural models, Ma et al. (2019) […] (2020) applied a retrieval model to assist the generation of paraphrased sentences.…”
Section: A Related Work · Citation type: mentioning · Confidence: 99%