2022
DOI: 10.48550/arxiv.2211.00106
Preprint

Data-Efficient Cross-Lingual Transfer with Language-Specific Subnetworks

Abstract: Large multilingual language models typically share their parameters across all languages, which enables cross-lingual task transfer, but learning can also be hindered when training updates from different languages are in conflict. In this paper, we propose novel methods for using language-specific subnetworks, which control cross-lingual parameter sharing, to reduce conflicts and increase positive transfer during fine-tuning. We introduce dynamic subnetworks, which are jointly updated with the model, and we combi…
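The abstract's core mechanism is gating which parameters each language may update during fine-tuning. As a rough illustration only, not the paper's implementation, the PyTorch sketch below applies a fixed per-language binary mask to the gradients before each optimizer step; the `lang_masks` dict, the toy linear layer, and the 50% sharing ratio are all assumptions. The paper's dynamic subnetworks differ in that the masks themselves are updated jointly with the model.

```python
# Minimal sketch (assumed setup, not the authors' code): fine-tuning with a
# language-specific binary mask that controls which parameters each
# language's updates may touch.
import torch
import torch.nn as nn

torch.manual_seed(0)

model = nn.Linear(16, 16)  # stand-in for one transformer sublayer
languages = ["en", "tr"]

# One fixed binary mask per language; roughly half the weights overlap
# across languages, so those parameters stay shared.
lang_masks = {
    lang: {name: (torch.rand_like(p) < 0.5).float()
           for name, p in model.named_parameters()}
    for lang in languages
}

optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

def masked_step(lang: str, x: torch.Tensor, y: torch.Tensor) -> None:
    """One fine-tuning step that only updates `lang`'s subnetwork."""
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    with torch.no_grad():
        for name, p in model.named_parameters():
            # Zero out gradients for parameters outside this
            # language's subnetwork, preventing conflicting updates.
            p.grad *= lang_masks[lang][name]
    optimizer.step()

masked_step("en", torch.randn(4, 16), torch.randn(4, 16))
```

With fixed masks as above, two languages can only conflict on the parameters where their masks overlap; making the masks trainable (the dynamic variant the abstract describes) would let the model learn where sharing helps rather than fixing it in advance.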

Cited by 0 publications | References 24 publications