ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp40776.2020.9054468

Cross Lingual Transfer Learning for Zero-Resource Domain Adaptation

Abstract: We propose a method for zero-resource domain adaptation of DNN acoustic models, for use in low-resource situations where the only in-language training data available may be poorly matched to the intended target domain. Our method uses a multi-lingual model in which several DNN layers are shared between languages. This architecture enables domain adaptation transforms learned for one well-resourced language to be applied to an entirely different low-resource language. First, to develop the technique we use Engli…
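As a concrete illustration of the architecture described in the abstract, here is a minimal PyTorch sketch, not the paper's actual implementation: all module names are illustrative, and the choice of a linear input transform as the domain-adaptation layer is an assumption, since the abstract does not specify the transform type.

# Minimal sketch (illustrative names; linear adaptation layer is an assumption).
import torch
import torch.nn as nn

class MultilingualAM(nn.Module):
    def __init__(self, feat_dim: int, hidden_dim: int, n_targets: dict):
        super().__init__()
        # Per-language domain-adaptation transform on the input features.
        self.adapt = nn.ModuleDict(
            {lang: nn.Linear(feat_dim, feat_dim) for lang in n_targets}
        )
        # Hidden layers shared across all languages.
        self.shared = nn.Sequential(
            nn.Linear(feat_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
        )
        # Language-specific output layers.
        self.heads = nn.ModuleDict(
            {lang: nn.Linear(hidden_dim, n) for lang, n in n_targets.items()}
        )

    def forward(self, feats: torch.Tensor, lang: str) -> torch.Tensor:
        return self.heads[lang](self.shared(self.adapt[lang](feats)))

model = MultilingualAM(feat_dim=40, hidden_dim=512,
                       n_targets={"en": 2000, "lowres": 1500})

# Zero-resource adaptation idea: fit the "en" adaptation transform on
# target-domain English data, then reuse its weights for the low-resource
# language, for which no in-domain data is available.
model.adapt["lowres"].load_state_dict(model.adapt["en"].state_dict())

Because the hidden layers are shared across languages, a transform that maps out-of-domain inputs toward the shared representation for one language can plausibly do the same for another; that is the premise the sketch encodes.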


Cited by 15 publications (5 citation statements); references 22 publications.
“…In vision studies, although the images from the training and test distributions can be substantially different, the two distributions mostly share the same support (pixels are always sampled from the 0-255 integer space), even though the densities of these distributions can be very different (e.g., photos vs. sketches). On the other hand, domain adaptation in NLP sometimes studies the regime where the supports of the data differ, e.g., the vocabularies can be significantly different in cross-lingual studies (Abad et al., 2020; Zhang et al., 2020a).…”
Section: Continuous vs. Discrete in Search Space (mentioning)
confidence: 99%
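The support-versus-density contrast drawn in this statement can be made concrete with a toy Python example; all values below are made up purely for illustration.

# Toy illustration (made-up values): two image domains share the pixel
# support {0, ..., 255} even when their densities differ, while two
# languages' vocabularies may not overlap at all.
photos = {0, 17, 128, 255}              # pixel values observed in photos
sketches = {0, 255}                     # pixel values observed in sketches
pixel_space = set(range(256))
assert photos <= pixel_space and sketches <= pixel_space  # same support

vocab_en = {"the", "model", "data"}     # source-language tokens
vocab_fr = {"le", "modèle", "données"}  # target-language tokens
print(vocab_en & vocab_fr)              # set(): the supports themselves differ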
“…To improve on this state of affairs, the natural approach is to adapt a model trained on a much larger quantity of adult speech, using these 13 hours as adaptation data. This is the transfer learning (TL) method, widely used in deep learning in general (Abad et al., 2020; Duan et al., 2020). We follow the recommendations of Shivakumar & Georgiou (2020), where the authors suggest, for very young children (5-8 years old), applying TL to all layers of the source model.…”
Section: Transfer Learning (unclassified)
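A minimal PyTorch sketch of that recipe follows; a toy feed-forward network stands in for the real adult-speech acoustic model, and only the decision to fine-tune all layers comes from the text above.

import torch
import torch.nn as nn

# Stand-in for a source model pre-trained on adult speech (hypothetical).
model = nn.Sequential(nn.Linear(40, 256), nn.ReLU(), nn.Linear(256, 100))

# Fine-tune ALL layers, as recommended for very young speakers:
# nothing is frozen, so every parameter is updated on the child data.
for p in model.parameters():
    p.requires_grad = True

opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

# Dummy batch standing in for the ~13 h child-speech adaptation set.
feats = torch.randn(32, 40)
targets = torch.randint(0, 100, (32,))
loss = loss_fn(model(feats), targets)
opt.zero_grad()
loss.backward()
opt.step()

The alternative, freezing the lower layers and updating only the output layers, is common when the adaptation set is tiny, but the cited recommendation for 5-8 year olds is to leave the whole network trainable.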
“…In vision studies, although the images from the training and test distributions can be substantially different, the two distributions mostly share the same support (pixels are always sampled from a 0-255 integer space), even though the densities of these distributions can be very different (photos vs. sketches). On the other hand, domain adaptation in NLP sometimes studies the regime where the supports of the data differ (e.g., the vocabularies can be significantly different in cross-lingual study (Abad et al., 2020; Zhang et al., 2020a)).…”
Section: Continuous vs. Discrete in Search Space (mentioning)
confidence: 99%