Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics 2020
DOI: 10.18653/v1/2020.acl-main.224

Bridging the Structural Gap Between Encoding and Decoding for Data-To-Text Generation

Abstract: Generating sequential natural language descriptions from graph-structured data (e.g., knowledge graph) is challenging, partly because of the structural differences between the input graph and the output text. Hence, popular sequence-to-sequence models, which require serialized input, are not a natural fit for this task. Graph neural networks, on the other hand, can better encode the input graph but broaden the structural gap between the encoder and decoder, making faithful generation difficult. To narrow this …

Cited by 68 publications (70 citation statements)
References 33 publications
“…Table 2 presents our baselines on the WebNLG data-to-text task. Our cross-entropy model is comparable to the very recent state-of-the-art model (Zhao et al., 2020). Further, we present single-reward RL models with ROUGE-L, BLEU, and entailment score as rewards, which again perform better than our cross-entropy model.…”
Section: Results on Data-to-Text Generation
confidence: 61%
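The single-reward RL fine-tuning mentioned in this excerpt is commonly implemented as a self-critical policy-gradient update. The sketch below is a rough illustration under stated assumptions, not the citing paper's implementation: the model's sample and greedy_decode methods are hypothetical, and NLTK's sentence_bleu stands in for the BLEU reward.

from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

def reinforce_loss(model, src, reference_tokens):
    """Self-critical REINFORCE loss: (baseline - reward) * sum of log-probs."""
    smooth = SmoothingFunction().method1

    # Sample a sequence and keep its per-token log-probabilities.
    # `sample` and `greedy_decode` are hypothetical seq2seq methods.
    sampled_tokens, log_probs = model.sample(src)   # log_probs: tensor (T,)
    greedy_tokens = model.greedy_decode(src)        # baseline hypothesis

    # Sequence-level rewards from a non-differentiable metric (here BLEU).
    reward = sentence_bleu([reference_tokens], sampled_tokens,
                           smoothing_function=smooth)
    baseline = sentence_bleu([reference_tokens], greedy_tokens,
                             smoothing_function=smooth)

    # Minimizing this increases the likelihood of samples that beat
    # the greedy baseline; ROUGE-L or an entailment score could be
    # substituted for BLEU in the same slot.
    return (baseline - reward) * log_probs.sum()

Minimizing this loss raises the probability of sampled outputs that outscore the greedy baseline, which is a standard way to optimize non-differentiable rewards.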
“…Given a set of Resource Description Framework (RDF) triples, the task is to generate a natural language text describing the facts in the RDF data. Following Zhao et al. (2020), we serialize and reorder the RDF data as an intermediate planning step, and feed the plan into a seq2seq model with attention and a copy mechanism.…”
Section: Data-to-Text Generation
confidence: 99%
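A minimal sketch of the serialize-and-reorder step this excerpt describes, assuming illustrative <S>/<P>/<O> delimiter tokens and a simple subject-grouping heuristic; the actual planning method of Zhao et al. (2020) differs, so this only shows the overall data flow.

from collections import defaultdict

def linearize_rdf(triples):
    """Serialize (subject, predicate, object) triples into a flat plan string."""
    # Group triples by subject so facts about one entity stay adjacent,
    # one simple way to "reorder" the graph before serialization.
    by_subject = defaultdict(list)
    for subj, pred, obj in triples:
        by_subject[subj].append((pred, obj))

    parts = []
    for subj, facts in by_subject.items():
        for pred, obj in facts:
            parts.append(f"<S> {subj} <P> {pred} <O> {obj}")
    return " ".join(parts)

plan = linearize_rdf([
    ("Alan_Bean", "occupation", "Test_pilot"),
    ("Alan_Bean", "mission", "Apollo_12"),
])
# The plan string is then fed to a seq2seq model; the copy mechanism lets
# rare entity names be copied verbatim from the plan into the output.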
“…Those models (Trisedya et al., 2018; Gong et al., 2019; Shen et al., 2020) usually first linearize the knowledge graph and then use an attention mechanism to generate the description sentences. Since linearizing the input graph may sacrifice the inter-dependencies within it, some papers (Ribeiro et al., 2019, 2020a; Zhao et al., 2020) use graph encoders such as GCNs (Duvenaud et al., 2015) and graph transformers (Koncel-Kedziorski et al., 2019) to encode the input graphs. Others (Shen et al., 2020) try to carefully design loss functions to control the generation quality.…”
Section: Related Work
confidence: 99%
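For contrast with linearization, a graph encoder consumes the adjacency structure directly, so neighboring nodes influence each other's representations. Below is a minimal single-layer graph-convolution sketch; the degree normalization, shapes, and class name are simplifying assumptions, not the architecture of any model cited here.

import torch
import torch.nn as nn

class SimpleGCNLayer(nn.Module):
    """One propagation step: each node averages its neighbors, then projects."""
    def __init__(self, dim):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, node_states, adjacency):
        # node_states: (N, dim); adjacency: (N, N) with self-loops included.
        degree = adjacency.sum(dim=1, keepdim=True).clamp(min=1.0)
        aggregated = adjacency @ node_states / degree  # mean over neighbors
        return torch.relu(self.linear(aggregated))

# Tiny example: 3 nodes, edges 0-1 and 1-2, plus self-loops.
adj = torch.tensor([[1., 1., 0.],
                    [1., 1., 1.],
                    [0., 1., 1.]])
states = torch.randn(3, 16)
updated = SimpleGCNLayer(16)(states, adj)  # each node now mixes in neighbors

Stacking such layers lets information flow along multi-hop paths in the graph, which is exactly the inter-dependency a flat linearization tends to lose.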
“…Translation models take natural language input and must faithfully decode it into natural language output. However, as shown in Zhao et al. (2020), bridging the gap between structured input and linear output is a difficult task. In addition, structured input such as graphs is usually semantically under-specified.…”
Section: Introduction
confidence: 99%