Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019
DOI: 10.18653/v1/d19-1314

Enhancing AMR-to-Text Generation with Dual Graph Representations

Abstract: Generating text from graph-based data, such as Abstract Meaning Representation (AMR), is a challenging task due to the inherent difficulty in how to properly encode the structure of a graph with labeled edges. To address this difficulty, we propose a novel graph-to-sequence model that encodes different but complementary perspectives of the structural information contained in the AMR graph. The model learns parallel top-down and bottom-up representations of nodes capturing contrasting views of the graph. We als…


Cited by 46 publications
(49 citation statements)
References 32 publications
“…Parameter Saving Strategy. (Ribeiro et al., 2019). Results are statistically significant with p < 0.05.…”
Section: Development Experiments
confidence: 82%
“…We consider two kinds of baseline models: 1) models based on Recurrent Neural Networks (Konstas et al., 2017; Cao and Clark, 2019) and Graph Neural Networks (GNNs) (Song et al., 2018; Beck et al., 2018; Damonte and Cohen, 2019; Guo et al., 2019b; Ribeiro et al., 2019). These models use an attention-based LSTM decoder.…”
Section: Results
confidence: 99%
“…Initial work on AMR-to-text generation adapted methods from statistical machine translation (MT) (Pourdamghani et al., 2016), grammar-based generation (Mille et al., 2017), tree-to-string transducers (Flanigan et al., 2016), and inverted semantic parsing (Lampouras and Vlachos, 2017). Neural approaches explored sequence-to-sequence models where the AMR is linearized (Konstas et al., 2017) or modeled with a graph encoder (Marcheggiani and Perez-Beltrachini, 2018; Damonte and Cohen, 2019; Ribeiro et al., 2019; Song et al., 2018; Zhu et al., 2019). As professionally-annotated AMR datasets are in English, all this work focuses on English.…”
Section: Related Work
confidence: 99%
“…Their supervised model is based on message passing through the topology of the incidence graph of the KG input. Such graph neural networks (Kipf and Welling, 2017; Veličković et al., 2018) have been widely adopted in supervised graph-to-text tasks (Beck et al., 2018; Damonte and Cohen, 2019; Ribeiro et al., 2019, 2020).…”
Section: Introduction
confidence: 99%