ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp39728.2021.9414959

Meta-Adapter: Efficient Cross-Lingual Adaptation With Meta-Learning

Abstract: Transfer learning from a multilingual model has shown favorable results on low-resource automatic speech recognition (ASR). However, full-model fine-tuning generates a separate model for every target language and is not suitable for deployment and maintenance in production. The key challenge lies in how to efficiently extend the pre-trained model with fewer parameters. In this paper, we propose to combine the adapter module with meta-learning algorithms to achieve high recognition performance under low-resource…

Cited by 13 publications (13 citation statements) · References 14 publications
“…Besides learning the language-agnostic features, the optimization-based meta-learning approaches [13], [26] that aim to find a proper initialization for rapid adaptation have also been explored for cross-lingual ASR [9]. Hsu et al. [8] proposed to apply model-agnostic meta-learning (MAML) [13] as the pre-training method and achieved significant improvement over the conventional multilingual pre-training baseline.…”
Section: A. Multilingual and Cross-lingual Speech Recognition (mentioning)
confidence: 99%
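
To make the optimization-based view concrete, below is a minimal first-order MAML sketch in PyTorch: an inner loop adapts a copy of the shared initialization on one language's support batch, and the outer update moves that initialization using the query-set gradients of the adapted models. This is an illustrative sketch only; the function and argument names (`fomaml_step`, `lang_tasks`, `loss_fn`) and the hyperparameters are assumptions, not the actual recipe from [8] or from this paper.

```python
import torch

def fomaml_step(model, lang_tasks, loss_fn, inner_lr=1e-3, meta_lr=1e-4):
    """One first-order MAML meta-update over a batch of per-language tasks.

    Illustrative only: `model` is any torch.nn.Module, `lang_tasks` is an
    iterable of (support_batch, query_batch) pairs for individual languages,
    and `loss_fn(model, batch)` returns a scalar loss (e.g. CTC). Names and
    hyperparameters are assumptions, not the setup used in the cited work.
    """
    init_state = {k: v.detach().clone() for k, v in model.state_dict().items()}
    meta_grads = {name: torch.zeros_like(p) for name, p in model.named_parameters()}
    n_tasks = 0

    for support, query in lang_tasks:
        n_tasks += 1
        # Inner loop: adapt a copy of the shared initialization to one language.
        model.load_state_dict(init_state)
        inner_opt = torch.optim.SGD(model.parameters(), lr=inner_lr)
        inner_opt.zero_grad()
        loss_fn(model, support).backward()
        inner_opt.step()

        # Outer objective: query loss of the adapted model. In first-order MAML
        # its gradient w.r.t. the adapted weights serves as the meta-gradient.
        model.zero_grad()
        loss_fn(model, query).backward()
        for name, param in model.named_parameters():
            if param.grad is not None:
                meta_grads[name] += param.grad.detach()

    # Meta-update: shift the shared initialization so that a few inner steps
    # on a new (low-resource) language already yield a good model.
    model.load_state_dict(init_state)
    with torch.no_grad():
        for name, param in model.named_parameters():
            param -= meta_lr * meta_grads[name] / max(n_tasks, 1)
    return model
```
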
“…Due to the limited training data in low-resource languages, direct re-training makes the model easily overfit. These problems make the transfer-based methods inefficient [9], [10]. Recently, the adapter module was proposed for parameter-efficient fine-tuning in multilingual or cross-lingual settings [9]-[11], which can mitigate overfitting.…”
(mentioning)
confidence: 99%
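
A generic bottleneck adapter of the kind referenced here is compact enough to sketch: a down-projection, nonlinearity, and up-projection wrapped in a residual connection, initialized near the identity. The PyTorch sketch below is a hedged illustration of the adapter idea in general, not the paper's Meta-Adapter module; the hidden size, activation, and insertion point are assumptions.

```python
import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    """Down-project / nonlinearity / up-project block with a residual path.

    Generic illustration; the bottleneck size, activation, and where the block
    is inserted in the backbone are assumptions, not the paper's configuration.
    """

    def __init__(self, d_model: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(d_model, bottleneck)
        self.up = nn.Linear(bottleneck, d_model)
        self.act = nn.ReLU()
        # Zero-initialize the up-projection so the adapter starts as an
        # identity mapping and does not disturb the frozen backbone.
        nn.init.zeros_(self.up.weight)
        nn.init.zeros_(self.up.bias)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(self.act(self.down(x)))
```

In a typical parameter-efficient setup, the pre-trained backbone is frozen (e.g. `p.requires_grad_(False)` on its parameters) and only the adapter weights are trained per target language, so each new language adds a small fraction of the full model's parameters rather than a complete model copy.
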