Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
DOI: 10.18653/v1/2020.emnlp-main.92

Online Back-Parsing for AMR-to-Text Generation

Abstract: AMR-to-text generation aims to recover a text containing the same meaning as an input AMR graph. Current research develops increasingly powerful graph encoders to better represent AMR graphs, with decoders based on standard language modeling being used to generate outputs. We propose a decoder that back-predicts projected AMR graphs on the target sentence during text generation. As a result, our outputs can better preserve the input meaning than standard decoders. Experiments on two AMR benchmarks show the s…
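
The abstract describes a decoder that, besides predicting the next target word, also predicts which part of the input AMR graph that word projects to. The following is a minimal sketch of that idea only, assuming a PyTorch-style recurrent decoder with an extra back-parsing head; all class and variable names are illustrative and not taken from the paper's code.

```python
# Minimal sketch (not the authors' implementation): one decoder step that predicts
# the next word and also "back-predicts" the AMR node the word projects to.
import torch
import torch.nn as nn

class BackParsingDecoderStep(nn.Module):
    def __init__(self, hidden_size: int, vocab_size: int):
        super().__init__()
        self.rnn_cell = nn.GRUCell(hidden_size, hidden_size)
        self.word_proj = nn.Linear(hidden_size, vocab_size)    # standard LM head
        self.node_query = nn.Linear(hidden_size, hidden_size)  # back-parsing head

    def forward(self, prev_emb, state, node_reprs):
        # prev_emb: embedding of the previously generated word  [B, H]
        # node_reprs: encoder representations of AMR nodes      [B, N, H]
        state = self.rnn_cell(prev_emb, state)
        word_logits = self.word_proj(state)                     # next-word scores
        # Back-parsing: score each input AMR node as the projection of this word.
        q = self.node_query(state).unsqueeze(2)                 # [B, H, 1]
        node_logits = torch.bmm(node_reprs, q).squeeze(2)       # [B, N]
        return state, word_logits, node_logits

# Training would combine both objectives, e.g.
# loss = ce(word_logits, gold_word) + lambda_bp * ce(node_logits, aligned_node)
```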

Cited by 14 publications (14 citation statements)
References 32 publications
“…Prior work, involving both GNNs and pretrained linearized models, has explored various ways of improving models' sensitivity to the structure of the input graph. To better maintain fidelity to the graph, previous graph-to-text methods incorporate additional loss terms, specialized architectures, or generation-time ranking to influence the semantic accuracy of generation: ranking outputs by the correctness of the AMR parse (Mager et al., 2020; Harkous et al., 2020), jointly "back-parsing" graphs when decoding (Bai et al., 2020), or using distinct components to model different graph traversals (Ribeiro et al., 2019).…”
Section: RQ2: Better Implicit Graph Encodings With Text-to-Text Scaffolding
confidence: 99%
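
One of the strategies listed in that statement, generation-time ranking, can be pictured as re-parsing each candidate output and keeping the one most faithful to the input graph. The sketch below only illustrates that idea and is not code from the cited papers; `generate_candidates`, `parse_to_amr`, and `smatch_f1` are hypothetical stand-ins for a generator, an AMR parser, and a graph-overlap metric.

```python
def rerank_by_parse(input_amr, generate_candidates, parse_to_amr, smatch_f1, n=5):
    """Generate n candidate sentences and keep the one whose re-parsed AMR
    best matches the input graph (higher Smatch F1 is better)."""
    candidates = generate_candidates(input_amr, n=n)
    scored = [(smatch_f1(parse_to_amr(text), input_amr), text) for text in candidates]
    best_score, best_text = max(scored)
    return best_text
```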
“…However, seq2seq approaches tend to lose structural information in AMR graphs since they simply linearize AMR graphs into sequences before feeding them into the models. To prevent information loss caused by linearization, a variety of graph-to-sequence approaches have been proposed to better model structural information (Song et al., 2018; Beck et al., 2018; Damonte and Cohen, 2019; Guo et al., 2019; Ribeiro et al., 2019; Cai and Lam, 2020b; Zhao et al., 2020; Yao et al., 2020; Bai et al., 2020). By taking advantage of strong pre-trained language models, recent studies achieve new state of the art (Mager et al., 2020; Harkous et al., 2020; Ribeiro et al., 2020; Bevilacqua et al., 2021).…”
Section: Related Work
confidence: 99%
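
The linearization that this statement contrasts with graph-to-sequence modeling can be illustrated with a small sketch: a depth-first traversal that flattens an AMR graph into a bracketed token sequence. The dictionary-based graph encoding and the token scheme below are assumptions made for the example, not the preprocessing used in any cited paper.

```python
# Illustrative sketch only: depth-first linearization of a toy AMR graph into a
# token sequence, the kind of flattening seq2seq approaches apply before encoding.
def linearize(graph, node, visited=None):
    """graph: {node: [(relation, child), ...]}; returns a flat token list."""
    if visited is None:
        visited = set()
    tokens = ["(", node]
    visited.add(node)
    for relation, child in graph.get(node, []):
        tokens.append(relation)
        if child in visited:          # re-entrant node: emit a reference only
            tokens.append(child)
        else:
            tokens.extend(linearize(graph, child, visited))
    tokens.append(")")
    return tokens

# Example: "The boy wants to go."
amr = {
    "want-01": [(":ARG0", "boy"), (":ARG1", "go-01")],
    "go-01":   [(":ARG0", "boy")],
}
print(" ".join(linearize(amr, "want-01")))
# ( want-01 :ARG0 ( boy ) :ARG1 ( go-01 :ARG0 boy ) )
```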
“…machine translation (Song et al., 2019). Both statistical (Flanigan et al., 2016; Pourdamghani et al., 2016) and neural methods (Bai et al., 2020; Cai and Lam, 2020) have been investigated for AMR-to-text generation, and recent methods make use of graph neural networks (GNNs) (Kipf and Welling, 2017) or Transformers (Vaswani et al., 2017) for representing the input graph.…”
Section: AMR Graph
confidence: 99%
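
As a rough illustration of how a GNN can represent the input graph, the sketch below implements a single GCN-style layer in the spirit of Kipf and Welling (2017). The shapes, class name, and normalization choice are assumptions for the example, not any cited model.

```python
# Minimal sketch: one GCN-style layer that updates AMR node representations by
# averaging over graph neighbours (plus a self-loop) and applying a linear map.
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, node_feats: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # node_feats: [N, dim] node embeddings; adj: [N, N] 0/1 adjacency matrix.
        adj_hat = adj + torch.eye(adj.size(0))     # add self-loops
        deg = adj_hat.sum(dim=1, keepdim=True)     # node degrees
        msg = adj_hat @ node_feats / deg           # mean over neighbours
        return torch.relu(self.linear(msg))

# Stacking a few such layers yields contextualized node vectors that a decoder
# can attend to, in place of (or alongside) a Transformer graph encoder.
```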