Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics 2023
DOI: 10.18653/v1/2023.eacl-main.180

Bootstrapping Multilingual Semantic Parsers using Large Language Models

Abhijeet Awasthi,
Nitish Gupta,
Bidisha Samanta
et al.

Abstract: Despite cross-lingual generalization demonstrated by pre-trained multilingual models, the translate-train paradigm of transferring English datasets across multiple languages remains a key mechanism for training task-specific multilingual models. However, for many low-resource languages, the availability of a reliable translation service entails significant amounts of costly human-annotated translation pairs. Further, translation services may continue to be brittle due to domain mismatch between task-speci…


Cited by 1 publication (1 citation statement)
References 30 publications
“…In the future, our approach, being task-agnostic, can be applied to more non-trivial tasks, such as generation (Kolluru et al., 2022, 2021), semantic parsing (Awasthi et al., 2023), relation extraction (Rathore et al., 2022; Bhartiya et al., 2022), and knowledge graph completion (Mittal et al., 2023). Our technique may complement other approaches for morphologically rich languages (Nzeyimana and Rubungo, 2022) and for those with scripts unseen in mBERT (Pfeiffer et al., 2021b).7…”

7 https://github.com/dair-iitd/ZGUL

Section: Discussion
Confidence: 99%