2021
DOI: 10.1609/aaai.v35i16.17661
Adversarial Meta Sampling for Multilingual Low-Resource Speech Recognition

Abstract: Low-resource automatic speech recognition (ASR) is challenging, as the low-resource target-language data are insufficient to train an ASR model well. To solve this issue, meta-learning formulates ASR for each source language as many small ASR tasks and meta-learns a model initialization over all tasks from the different source languages, enabling fast adaptation to unseen target languages. However, the quantity and difficulty of these tasks vary greatly across source languages because of their different data scales and diverse phon…
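The abstract describes MAML-style meta-learning: adapt to each small per-language task from a shared initialization, then update that initialization so one adaptation step works well across tasks. A minimal first-order sketch on toy linear tasks, assuming illustrative names and hyperparameters (`inner_lr`, `meta_lr`, `loss_fn`) that are not from the paper:

```python
# Hedged sketch of first-order MAML-style meta-learning of a model
# initialization. Tasks, losses, and learning rates are toy stand-ins
# for the paper's ASR setup, not its actual implementation.
import numpy as np

rng = np.random.default_rng(0)

def loss_fn(w, X, y):
    # squared-error loss of a linear model (stand-in for an ASR loss)
    return float(np.mean((X @ w - y) ** 2))

def grad_fn(w, X, y):
    return 2 * X.T @ (X @ w - y) / len(y)

def maml_init(tasks, dim, inner_lr=0.05, meta_lr=0.1, meta_steps=200):
    """Meta-learn a shared initialization w0 over many small tasks."""
    w0 = np.zeros(dim)
    for _ in range(meta_steps):
        meta_grad = np.zeros(dim)
        for X, y in tasks:
            # inner step: adapt to one task from the shared init
            w_adapted = w0 - inner_lr * grad_fn(w0, X, y)
            # first-order meta-gradient: gradient at the adapted weights
            meta_grad += grad_fn(w_adapted, X, y)
        w0 -= meta_lr * meta_grad / len(tasks)
    return w0

# toy "source language" tasks sharing an underlying solution
w_true = rng.normal(size=3)
tasks = []
for _ in range(5):
    X = rng.normal(size=(20, 3))
    y = X @ w_true + 0.01 * rng.normal(size=20)
    tasks.append((X, y))

w0 = maml_init(tasks, dim=3)
```

After meta-training, `w0` should give much lower task loss than a cold start, which is the "fast adaptation on unseen target languages" property the abstract refers to.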

Cited by 20 publications (1 citation statement)
References 30 publications
“…This process is known as meta-sampling. It has been proposed in various application domains, including computer vision [26], [27] and speech recognition [28], to extract a training dataset that is tailored to the given task. In the context of GML, meta-sampling presents an opportunity to optimize training models on large KGs by selecting a representative sub-graph that is relevant to the task.…”
Section: A. Automatic Training: Methods Selection and Meta-Sampling
confidence: 99%
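The quoted statement describes meta-sampling as extracting a training subset tailored to a given task. A minimal sketch, assuming a purely illustrative scoring rule (cosine similarity to the target task's centroid) that is not taken from the cited papers:

```python
# Hedged sketch of the meta-sampling idea: select, from a large pool,
# the k training examples most relevant to a target task. The cosine
# scoring rule below is an illustrative assumption.
import numpy as np

def meta_sample(pool, target, k):
    """Return indices of the k pool examples closest to the target centroid."""
    centroid = target.mean(axis=0)
    centroid /= np.linalg.norm(centroid) + 1e-12
    norms = np.linalg.norm(pool, axis=1) + 1e-12
    scores = (pool @ centroid) / norms      # cosine similarity per example
    return np.argsort(scores)[::-1][:k]     # indices of the k best matches

rng = np.random.default_rng(1)
pool = rng.normal(size=(100, 8))                # candidate training data
target = rng.normal(loc=2.0, size=(10, 8))      # shifted target task
idx = meta_sample(pool, target, k=20)
```

In the graph-ML setting the statement mentions, the "pool" would be nodes or edges of a large KG and the selected subset a task-relevant sub-graph; the selection criterion would differ, but the shape of the procedure is the same.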