Findings of the Association for Computational Linguistics: EMNLP 2023
DOI: 10.18653/v1/2023.findings-emnlp.829

Macedon: Minimizing Representation Coding Rate Reduction for Cross-Lingual Natural Language Understanding

Haoyu Wang, Yaqing Wang, Huaxiu Yao, et al.

Abstract: Cross-lingual natural language understanding (NLU) is one of the fundamental tasks of NLP. The goal is to learn a model that generalizes well on both high-resource and low-resource language data. Recent pre-trained multilingual language models, e.g., multilingual BERT and XLM, have shown impressive performance on cross-lingual NLU tasks. However, such promising results require sufficient training data, a condition that is difficult to satisfy for low-resource languages. When the data is limited in t…
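
The title names a coding-rate quantity for representations. For orientation, below is a minimal sketch of the standard coding-rate function R(Z; ε) = ½ log det(I + d/(nε²) ZᵀZ) from the rate-reduction literature (e.g., MCR²), together with the rate reduction between two batches. The source/target pairing, the `eps` value, and the batch shapes are illustrative assumptions; the abstract alone does not specify Macedon's exact objective.

```python
# Sketch of the coding rate and coding rate reduction, following the
# standard rate-reduction definition. All shapes, eps, and the
# source/target pairing are illustrative assumptions, not details
# taken from this abstract.
import numpy as np

def coding_rate(Z: np.ndarray, eps: float = 0.5) -> float:
    """R(Z; eps) = 1/2 * logdet(I + d/(n*eps^2) * Z^T Z) for an n x d batch Z."""
    n, d = Z.shape
    gram = np.eye(d) + (d / (n * eps**2)) * (Z.T @ Z)
    _, logdet = np.linalg.slogdet(gram)  # numerically stable log-determinant
    return 0.5 * logdet

def coding_rate_reduction(Z_src: np.ndarray, Z_tgt: np.ndarray,
                          eps: float = 0.5) -> float:
    """Rate of the pooled set minus the size-weighted rates of each part.
    Driving this toward zero pushes the two sets into a shared subspace,
    i.e., toward language-invariant representations (a hypothetical
    reading of the title; the paper's objective may differ)."""
    n1, n2 = len(Z_src), len(Z_tgt)
    n = n1 + n2
    Z_all = np.vstack([Z_src, Z_tgt])
    return (coding_rate(Z_all, eps)
            - (n1 / n) * coding_rate(Z_src, eps)
            - (n2 / n) * coding_rate(Z_tgt, eps))

# Example: two random 32-sample batches of 768-dim (BERT-sized) features.
rng = np.random.default_rng(0)
Z_en = rng.normal(size=(32, 768))
Z_sw = rng.normal(size=(32, 768))
print(coding_rate_reduction(Z_en, Z_sw))
```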

Cited by 0 publications. References: 31 publications.