2022
DOI: 10.48550/arxiv.2205.01538
Preprint

SUBS: Subtree Substitution for Compositional Semantic Parsing

Abstract: Although sequence-to-sequence models often achieve good performance in semantic parsing for i.i.d. data, their performance is still inferior in compositional generalization. Several data augmentation methods have been proposed to alleviate this problem. However, prior work only leveraged superficial grammar or rules for data augmentation, which resulted in limited improvement. We propose to use subtree substitution for compositional data augmentation, where we consider subtrees with similar semantic functions …
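The abstract only sketches the idea, so the following is a minimal, hypothetical illustration of subtree-substitution data augmentation: logical forms are treated as trees, and two training examples exchange subtrees whose root labels match, a crude stand-in for "subtrees with similar semantic functions". The tuple-based tree encoding and all function names are assumptions for illustration, not the paper's released implementation.

```python
# Minimal sketch of subtree-substitution data augmentation (hypothetical;
# not the SUBS paper's released code). Logical forms are nested tuples
# (label, child_1, ..., child_k); matching root labels serve as a crude
# proxy for "subtrees with similar semantic functions".
import random
from typing import List, Tuple

Tree = Tuple  # (label, *children); a leaf is a 1-tuple like ("texas",)


def collect_subtrees(tree: Tree, out: List[Tree]) -> None:
    """Record every subtree of `tree` in pre-order."""
    out.append(tree)
    for child in tree[1:]:
        collect_subtrees(child, out)


def substitute(tree: Tree, target: Tree, replacement: Tree) -> Tree:
    """Return a copy of `tree` with the first occurrence of `target` replaced."""
    if tree == target:
        return replacement
    children = []
    done = False
    for child in tree[1:]:
        if not done:
            new_child = substitute(child, target, replacement)
            done = new_child != child
            children.append(new_child)
        else:
            children.append(child)
    return (tree[0], *children)


def augment(corpus: List[Tree], n_new: int, seed: int = 0) -> List[Tree]:
    """Create `n_new` trees by swapping label-compatible subtrees between examples."""
    rng = random.Random(seed)
    augmented: List[Tree] = []
    attempts = 0
    while len(augmented) < n_new and attempts < 100 * n_new:
        attempts += 1
        src, donor = rng.sample(corpus, 2)
        src_subs: List[Tree] = []
        donor_subs: List[Tree] = []
        collect_subtrees(src, src_subs)
        collect_subtrees(donor, donor_subs)
        # Pair proper subtrees of `src` with donor subtrees sharing a root label.
        pairs = [(s, d) for s in src_subs for d in donor_subs
                 if s != src and s[0] == d[0] and s != d]
        if not pairs:
            continue
        s, d = rng.choice(pairs)
        augmented.append(substitute(src, s, d))
    return augmented


if __name__ == "__main__":
    corpus = [
        ("answer", ("river", ("loc", ("state", ("texas",))))),
        ("answer", ("city", ("loc", ("state", ("ohio",))))),
    ]
    for t in augment(corpus, n_new=2):
        print(t)
```

In this toy run the two GeoQuery-style logical forms trade, for example, their ("state", ...) subtrees, yielding new well-formed trees; the actual method would additionally constrain substitution by semantic function rather than by a bare label match.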

Cited by 1 publication (1 citation statement) | References 16 publications
“…Generally, one way to improve compositional generalizability is to incorporate inductive biases directly into models through modular models (Dong and Lapata, 2018), symbolic-neural machines, latent variables/intermediate representations (Zheng and Lapata, 2020), meta-learning (Lake, 2019), etc. Another way is to first do data augmentation and then train a model with the augmented data (Andreas, 2019; Zhong et al., 2020; Akyürek et al., 2020; Yang et al., 2022b). Pretrained models have also been shown to be useful for compositional semantic parsing (Oren et al., 2020; Furrer et al., 2020).…”
Section: Compositional Generalization in Semantic Parsing (mentioning)
confidence: 99%