Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing 2021
DOI: 10.18653/v1/2021.emnlp-main.472
Frustratingly Simple but Surprisingly Strong: Using Language-Independent Features for Zero-shot Cross-lingual Semantic Parsing

Abstract: The availability of corpora has led to significant advances in training semantic parsers in English. Unfortunately, for languages other than English, annotated data is limited and so is the performance of the developed parsers. Recently, pretrained multilingual models have proven useful for zero-shot cross-lingual transfer in many NLP tasks. What else is required to apply a parser trained in English to other languages for zero-shot cross-lingual semantic parsing? Will simple language-independent featu…
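The core idea behind language-independent features can be illustrated with a minimal sketch (not the paper's implementation; the sentences and tags below are hand-annotated hypothetical example data): when lexical tokens are replaced by Universal POS tags, parallel sentences in different languages map to the same feature sequence, so a parser trained on English feature sequences can in principle be applied unchanged to another language.

```python
# Hand-annotated parallel sentences (hypothetical example data):
# each token is paired with its Universal POS tag.
en = [("show", "VERB"), ("me", "PRON"), ("flights", "NOUN"),
      ("to", "ADP"), ("boston", "PROPN")]
de = [("zeige", "VERB"), ("mir", "PRON"), ("flüge", "NOUN"),
      ("nach", "ADP"), ("boston", "PROPN")]

def upos_features(tagged):
    """Map a tagged sentence to its Universal POS tag sequence,
    discarding the language-specific surface tokens."""
    return [tag for _, tag in tagged]

# The surface tokens differ across languages, but the
# language-independent feature sequences coincide:
print(upos_features(en) == upos_features(de))  # True
```

In practice the tags would come from a multilingual tagger rather than manual annotation; the point of the sketch is only that the feature representation abstracts away the language-specific vocabulary.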

Cited by 5 publications (6 citation statements)
References 30 publications
“…There has been recent interest in evaluating if machine translation is an economic proxy for creating training data in new languages (Sherborne et al, 2020;Moradshahi et al, 2020). Zero-shot approaches to cross-lingual parsing have also been explored using auxiliary training objectives (Yang et al, 2021;Sherborne and Lapata, 2022). Cross-lingual learning has also been gaining traction in the adjacent field of spoken-language understanding (SLU).…”
Section: Related Work
confidence: 99%
“…Within cross-lingual semantic parsing, there has been an effort to bootstrap parsers with minimal data to avoid the cost and labor required to support new languages. Recent proposals include using machine translation to approximate training data for supervised learning (Moradshahi et al, 2020; Sherborne et al, 2020; Nicosia et al, 2021) and zero-shot models, which engineer cross-lingual similarity with auxiliary losses (van der Goot et al, 2021; Yang et al, 2021; Sherborne and Lapata, 2022). These shortcuts bypass costly data annotation but present limitations such as “translationese” artifacts from machine translation (Koppel and Ordan, 2011) or undesirable domain shift (Sherborne and Lapata, 2022).…”
Section: Introduction
confidence: 99%
“…Other ways of bootstrapping a semantic parser require rules/grammars to synthesize training examples (Xu et al, 2020; Wang et al, 2015; Campagna et al, 2019; Weir et al, 2020; Marzoev et al, 2020). Yang et al (2021) used language-independent features for zero-shot cross-lingual semantic parsing.…”
Section: Case Study
confidence: 99%
“…Dependency tree cropping and rotation within a sentence were used in low-resource language POS tagging (Şahin and Steedman, 2019) and dependency parsing (Vania et al, 2019). Dependency tree swapping was explored in low-resource language dependency parsing (Dehouck and Gómez-Rodríguez, 2020), and Universal Dependency features were used for zero-shot cross-lingual semantic parsing (Yang et al, 2021). However, subtree substitution with fine-grained meaning functions has not been examined.…”
Section: Related Work
confidence: 99%