2021
DOI: 10.1162/tacl_a_00396

Joint Universal Syntactic and Semantic Parsing

Abstract: While numerous attempts have been made to jointly parse syntax and semantics, high performance in one domain typically comes at the price of performance in the other. This trade-off contradicts the large body of research focusing on the rich interactions at the syntax–semantics interface. We explore multiple model architectures that allow us to exploit the rich syntactic and semantic annotations contained in the Universal Decompositional Semantics (UDS) dataset, jointly parsing Universal Dependencies and UDS t…

Cited by 2 publications (2 citation statements)
References 41 publications
“…We explore both a sequence-to-sequence (seq2seq) model and a sequence-to-graph (seq2graph) model, using the MISO framework (Zhang et al., 2019b; Stengel-Eskin et al., 2021), which is built on top of AllenNLP (Gardner et al., 2018). The former directly predicts the Lisp string, while the latter produces a DAG as seen at the bottom of Fig.…”
Section: Semantic Parsing
confidence: 99%
“…Transformer-based transductive. A more competitive approach follows the transductive parsing paradigm (Zhang et al., 2019a), which aims to directly produce the underlying DAG instead of the surface form, generating graph nodes as well as edges. We implement a transformer-based transductive model based on the architecture and code from Stengel-Eskin et al. (2021). The model directly generates the linearized DAG (cf.…”
Section: LSTM Seq2seq
confidence: 99%
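The citation statement above describes a transductive target: rather than a surface Lisp string, the decoder emits a linearized DAG of nodes plus pointer-style edges. As a minimal sketch (the function name, token format, and edge encoding here are illustrative assumptions, not the actual MISO format):

```python
def linearize_dag(nodes, edges):
    """Flatten a DAG into the kind of token sequence a seq2graph decoder
    might emit: each node label, optionally followed by a back-pointer
    token naming its head node index and the edge label.

    nodes: list of node labels in a fixed (e.g. depth-first) order
    edges: list of (child_idx, head_idx, edge_label) triples
    """
    heads = {child: (head, label) for child, head, label in edges}
    tokens = []
    for i, node_label in enumerate(nodes):
        tokens.append(node_label)
        if i in heads:
            head, label = heads[i]
            # pointer back to an already-emitted head node, plus edge label
            tokens.append(f"@{head}:{label}")
    return tokens
```

For example, `linearize_dag(["eat", "cake"], [(1, 0, "ARG1")])` yields `["eat", "cake", "@0:ARG1"]` — the second node carries a pointer to its head. The actual models predict such sequences token by token, with the pointer tokens resolved into graph edges at decode time.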