Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/d15-1169
Joint A* CCG Parsing and Semantic Role Labelling

Abstract: Joint models of syntactic and semantic parsing have the potential to improve performance on both tasks, but to date the best results have been achieved with pipelines. We introduce a joint model using CCG, which is motivated by the close link between CCG syntax and semantics. Semantic roles are recovered by labelling the deep dependency structures produced by the grammar. Furthermore, because CCG is lexicalized, we show it is possible to factor the parsing model over words and introduce a new A* parsing algor…
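The word-factored A* idea the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's actual parser: the toy lexicon, rule table, and scores are invented for this example, and real CCG parsing uses the full combinators and learned per-word scores. The key point the sketch does capture is that factoring the model over words makes an admissible A* heuristic cheap to compute: the outside bound for a span is just the sum of each outside word's best lexical-category score.

```python
import heapq

def astar_parse(words, lexicon, rules, goal="S"):
    """A* chart parsing with a model factored over words.

    lexicon: word -> {category: log-prob score}
    rules:   (left_cat, right_cat) -> result_cat (toy binary rule table)
    Returns (category, inside_score) of the best goal derivation, or None.
    """
    n = len(words)
    best_tag = [max(lexicon[w].values()) for w in words]
    prefix = [0.0]
    for s in best_tag:
        prefix.append(prefix[-1] + s)
    total = prefix[-1]

    def outside(i, j):
        # Admissible upper bound on any derivation outside span [i, j):
        # the best lexical score of every word not covered by the span.
        return total - (prefix[j] - prefix[i])

    agenda = []  # entries: (-f, start, end, category, inside_score)
    for i, w in enumerate(words):
        for cat, s in lexicon[w].items():
            heapq.heappush(agenda, (-(s + outside(i, i + 1)), i, i + 1, cat, s))

    finished = {}              # (i, j, cat) -> best inside score
    by_start, by_end = {}, {}  # adjacency indices for combining spans
    while agenda:
        _, i, j, cat, inside = heapq.heappop(agenda)
        if (i, j, cat) in finished:
            continue  # a better derivation of this item was already popped
        finished[(i, j, cat)] = inside
        if (i, j, cat) == (0, n, goal):
            return cat, inside
        by_start.setdefault(i, []).append((j, cat, inside))
        by_end.setdefault(j, []).append((i, cat, inside))
        # Combine with finished items immediately to the right ...
        for k, rcat, rs in by_start.get(j, []):
            res = rules.get((cat, rcat))
            if res:
                heapq.heappush(agenda,
                               (-(inside + rs + outside(i, k)), i, k, res, inside + rs))
        # ... and immediately to the left.
        for h, lcat, ls in by_end.get(i, []):
            res = rules.get((lcat, cat))
            if res:
                heapq.heappush(agenda,
                               (-(ls + inside + outside(h, j)), h, j, res, ls + inside))
    return None

# Toy lexicon and rule table (hypothetical, for illustration only).
LEXICON = {
    "John": {"NP": -0.1},
    "saw": {"(S\\NP)/NP": -0.3, "NP": -2.0},
    "Mary": {"NP": -0.1},
}
RULES = {
    ("NP", "S\\NP"): "S",            # backward application
    ("(S\\NP)/NP", "NP"): "S\\NP",   # forward application
}
```

On `"John saw Mary"`, the search pops the verb's transitive category before its much worse noun reading, combines it with `Mary`, then with `John`, and returns the goal with inside score -0.5 without ever exploring the bad analysis.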

Cited by 42 publications (43 citation statements)
References 37 publications
“…To realize this vision in practice, an approach is needed to build a CCG parser enriched with graph semantics for deriving AMRs. We anticipate that existing CCG parsing frameworks can be adapted: for example, by developing an alignment algorithm to induce the semantics for lexical entries from the AMR corpus, and running an off-the-shelf parser like EasySRL (Lewis et al., 2015) at training and test time for the syntactic side of the derivation. This approach would take advantage of the fact that our analysis assumes the ordinary CCG syntax for obtaining the compositional structure of the derivation.”
Section: Discussion
confidence: 99%
“…More ambitiously, the annotation has the potential to be used for training parsers. A joint syntactic and semantic parser, such as that of Lewis et al. (2015), could be trained directly on the annotations to improve both the syntactic and semantic models, for example in domain transfer settings. Alternatively, the annotation could be used for active learning: we envisage a scheme where parsers, when faced with ambiguous attachment decisions, can generate a human-readable question whose answer will resolve the attachment.”
Section: Discussion and Future Work
confidence: 99%
“…Joint learning in NLP pipelines. To avoid cascading errors, much effort has been devoted to joint decoding in NLP pipelines (Habash and Rambow, 2005; Cohen and Smith, 2007; Goldberg and Tsarfaty, 2008; Lewis et al., 2015; Zhang et al., 2015, inter alia). However, joint inference can sometimes be prohibitively expensive.”
Section: Related Work
confidence: 99%