2018
DOI: 10.48550/arxiv.1805.05286
Preprint
AMR Parsing as Graph Prediction with Latent Alignment

Cited by 6 publications
(11 citation statements)
References 0 publications
“…In order to produce Wikipedia entries in our AMR graphs, we run a wikification approach as post-processing. We combine the approach of Lyu and Titov (2018) with the entity linking technique of Sil et al (2018).…”
Section: Wikification
confidence: 99%
“…During post processing, every node with :name label is looked up in the dictionary and if found, is assigned the corresponding Wikipedia link. This is very similar to the approach of Lyu and Titov (2018). If the node is not found in the dictionary, and the system of Sil et al (2018) produces a Wikipedia link, we use that link.…”
Section: Wikification
confidence: 99%
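The dictionary-lookup post-processing the statement above describes can be sketched as follows. This is a minimal illustration, not the cited systems' actual code: the node representation, the dictionary, and the fallback linker (standing in for the system of Sil et al., 2018) are all hypothetical.

```python
# Hedged sketch of wikification as post-processing: every node carrying a
# name is looked up in a name-to-Wikipedia dictionary; if it is absent,
# an external entity linker (a stand-in callable here) is consulted.

def wikify(nodes, wiki_dict, fallback_linker):
    """Attach Wikipedia links to AMR nodes in place.

    nodes: list of dicts, each possibly holding a 'name' key.
    wiki_dict: mapping from surface name to Wikipedia title.
    fallback_linker: callable name -> Wikipedia title or None.
    """
    for node in nodes:
        name = node.get("name")
        if name is None:
            continue  # node has no :name label; nothing to wikify
        if name in wiki_dict:
            node["wiki"] = wiki_dict[name]
        else:
            link = fallback_linker(name)
            if link is not None:
                node["wiki"] = link
    return nodes

# Toy usage: one dictionary hit, one fallback hit, one non-entity node.
nodes = [{"name": "Obama"}, {"name": "Ruritania"}, {"concept": "want-01"}]
wiki_dict = {"Obama": "Barack_Obama"}
linker = lambda n: "Ruritania" if n == "Ruritania" else None
wikify(nodes, wiki_dict, linker)
```

Nodes matched by neither the dictionary nor the linker are simply left without a link, which mirrors the two-stage fallback the quoted passage describes.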
“…They also model inductive biases indirectly through graph re-categorization, detailed in Section 6.1, which requires a named entity recognition system at test time. Re-categorization was proposed in Lyu and Titov (2018), which reformulated alignments as a differentiable permutation problem, interpretable as another form of inductive bias.…”
Section: Related Work
confidence: 99%
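The "differentiable permutation" mentioned above is typically realized by Sinkhorn normalization, which relaxes a hard alignment matrix into a doubly stochastic one. The sketch below is an assumption-laden illustration of that relaxation, not the paper's implementation; the iteration count and toy score matrix are arbitrary.

```python
# Hedged sketch of Sinkhorn normalization: alternately normalize rows and
# columns of a score matrix in log space, so its exponential approaches a
# doubly stochastic matrix (a differentiable stand-in for a permutation).

import numpy as np

def sinkhorn(log_alpha, n_iters=50):
    """Return exp(log_alpha) after alternating row/column normalization."""
    log_alpha = log_alpha.copy()
    for _ in range(n_iters):
        log_alpha -= np.log(np.exp(log_alpha).sum(axis=1, keepdims=True))
        log_alpha -= np.log(np.exp(log_alpha).sum(axis=0, keepdims=True))
    return np.exp(log_alpha)

# Toy usage: a 3x3 score matrix relaxes toward a permutation-like matrix
# whose largest entry in each row picks out one distinct column.
scores = np.array([[3.0, 0.0, 0.0],
                   [0.0, 0.0, 3.0],
                   [0.0, 3.0, 0.0]])
P = sinkhorn(scores)
```

Because every step is smooth, gradients flow through the relaxed alignment, which is what makes the permutation problem differentiable.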
“…Fundamentally, this is because predicting graphs is difficult: every graph has many possible linearizations, so from a probabilistic perspective, the linearization is a latent variable that must be marginalized out (Li et al, 2018). Groschwitz et al (2018) model graphs as trees, interpreted as the (latent) derivation trees of a graph grammar; Lyu and Titov (2018) model graphs with a conditional variant of the classic Erdős and Rényi (1959) model, first predicting an alignment for each node of the output graph, and then predicting, for each pair of nodes, whether there is an edge between them. Buys and Blunsom (2017), Chen et al (2018), and Damonte et al (2017) all model graph generation as a sequence of actions, each aligned to a word in the conditioning sentence.…”
Section: Introduction
confidence: 99%
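The per-pair edge prediction step the statement above attributes to Lyu and Titov (2018) can be sketched as scoring every ordered pair of node representations and keeping pairs whose probability clears a threshold. The bilinear scorer, embeddings, and threshold below are illustrative assumptions, not the paper's trained model.

```python
# Hedged sketch of pairwise edge prediction: score each ordered pair of
# node vectors with a bilinear form, squash with a sigmoid, and predict
# an edge wherever the probability exceeds a threshold.

import numpy as np

def predict_edges(node_vecs, W, threshold=0.5):
    """node_vecs: (n, d) array of node embeddings.
    W: (d, d) bilinear scoring matrix (illustrative, untrained).
    Returns the set of ordered pairs (i, j) predicted to be edges."""
    scores = node_vecs @ W @ node_vecs.T           # (n, n) pairwise logits
    probs = 1.0 / (1.0 + np.exp(-scores))          # elementwise sigmoid
    n = len(node_vecs)
    return {(i, j) for i in range(n) for j in range(n)
            if i != j and probs[i, j] > threshold}

# Toy usage: with an identity scorer, similar vectors get high edge
# probability and dissimilar ones get low probability.
vecs = np.array([[2.0, 0.0],
                 [2.0, 0.0],
                 [-2.0, 0.0]])
edges = predict_edges(vecs, np.eye(2))
```

Scoring all pairs independently is what lets this factorized model sidestep linearization entirely: no generation order over edges is ever imposed.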