2021
DOI: 10.48550/arxiv.2103.09120
Preprint

Structural Adapters in Pretrained Language Models for AMR-to-text Generation

Abstract: Previous work on text generation from graph-structured data relies on pretrained language models (PLMs) and utilizes graph linearization heuristics rather than explicitly considering the graph structure. Efficiently encoding the graph structure in PLMs is challenging because they were pretrained on natural language, and modeling structured data may lead to catastrophic forgetting of distributional knowledge. In this paper, we propose StructAdapt, an adapter method to encode graph structure into PLMs. Contrary …
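The abstract describes adapters that inject graph structure into a frozen PLM. As a rough illustration only (the exact architecture is in the paper, not this page), a structural adapter can be sketched as a bottleneck that down-projects the PLM's hidden states, runs a graph-convolution step over the AMR graph's adjacency matrix, up-projects, and adds a residual connection so the pretrained representations are preserved. All names and dimensions below are hypothetical:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

class StructAdapterSketch:
    """Hypothetical sketch of a structural adapter: down-project ->
    graph convolution over the AMR graph -> up-project, with a
    residual connection around a frozen PLM layer's hidden states.
    Dimensions and initialization are illustrative, not the paper's."""

    def __init__(self, hidden=16, bottleneck=4, seed=0):
        rng = np.random.default_rng(seed)
        self.W_down = rng.normal(0.0, 0.02, (hidden, bottleneck))
        self.W_graph = rng.normal(0.0, 0.02, (bottleneck, bottleneck))
        self.W_up = rng.normal(0.0, 0.02, (bottleneck, hidden))

    def __call__(self, H, A):
        # H: (nodes, hidden) PLM hidden states for the graph's nodes
        # A: (nodes, nodes) adjacency matrix of the AMR graph
        deg = A.sum(axis=1, keepdims=True).clip(min=1.0)
        Z = relu(H @ self.W_down)                 # down-project to bottleneck
        Z = relu(((A @ Z) / deg) @ self.W_graph)  # mean-aggregate neighbours (one GCN-style step)
        return H + Z @ self.W_up                  # up-project; residual keeps PLM knowledge

# Toy 3-node path graph standing in for an AMR graph.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
H = np.ones((3, 16))
out = StructAdapterSketch()(H, A)
print(out.shape)  # (3, 16): same shape as the input states
```

Because only the small adapter matrices would be trained while the PLM stays frozen, this kind of design directly addresses the catastrophic-forgetting concern raised in the abstract.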

Cited by 0 publications
References 25 publications