2019
DOI: 10.1109/jas.2019.1911693
Investigation of knowledge transfer approaches to improve the acoustic modeling of Vietnamese ASR system

Cited by 10 publications (3 citation statements)
References 31 publications
“…A generic ASR model can also be adapted to another narrow domain using DTL. With the help of high-resource languages, several knowledge transfer methods are investigated in [129] to overcome the data sparsity problem. The first is the DTL and fine-tuning technique, which uses a well-trained neural network to initialize the LHN parameters.…”
Section: Cross-language DTL (mentioning, confidence: 99%)
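The fine-tuning initialization described in the statement above (copying a well-trained source network into the target model, then replacing the output layer) can be sketched roughly as follows. This is a minimal numpy illustration, not the authors' implementation; the layer sizes, the `init_target_network` helper, and the 500/300-output inventories are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "well-trained" source-language network:
# two hidden layers plus an output layer over 500 source-language units.
pretrained = {
    "W1": rng.normal(size=(40, 64)),
    "W2": rng.normal(size=(64, 64)),
    "W_out": rng.normal(size=(64, 500)),
}

def init_target_network(source, n_target_outputs, rng):
    """Initialize a target-language network DTL-style: hidden layers
    are copied from the source net, while the output layer is freshly
    initialized because the target language has its own unit inventory."""
    target = {k: v.copy() for k, v in source.items() if k != "W_out"}
    target["W_out"] = rng.normal(
        size=(source["W_out"].shape[0], n_target_outputs)
    )
    return target

# Target (low-resource) language with a hypothetical 300-unit inventory.
target = init_target_network(pretrained, n_target_outputs=300, rng=rng)
```

After this initialization, training would continue on the small target-language corpus, either updating all layers or only the new output layer, which is the fine-tuning step the citation refers to.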
“…Multilingual training can be thought of as a series of shared hidden layers (SHL) and language-specific layers or classifier layers for various languages. The source model's SHL serve as a feature converter, converting various language features to a common feature space [129]. However, some language-dependent features may exist in the common feature space, which is not a positive factor for cross-lingual knowledge transfer.…”
Section: Adversarial TL-based ASR (mentioning, confidence: 99%)
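The SHL arrangement in the statement above (one shared feature converter, one classifier head per language) can be sketched as follows. This is a hedged toy example: the layer dimensions, the `"vi"`/`"en"` heads, and the 40-dimensional input frame are illustrative assumptions, not details from the cited work.

```python
import numpy as np

rng = np.random.default_rng(1)

def relu(x):
    return np.maximum(x, 0.0)

# Shared hidden layers (SHL): one feature converter for all languages.
shl = [rng.normal(size=(40, 64)), rng.normal(size=(64, 64))]

# Language-specific classifier layers stacked on the shared features.
heads = {
    "vi": rng.normal(size=(64, 300)),  # hypothetical Vietnamese inventory
    "en": rng.normal(size=(64, 500)),  # hypothetical English inventory
}

def forward(x, language):
    """Map an acoustic frame through the shared layers into the common
    feature space, then through the chosen language's classifier."""
    h = x
    for W in shl:
        h = relu(h @ W)
    return h @ heads[language]

frame = rng.normal(size=(1, 40))
vi_logits = forward(frame, "vi")
```

Because every language routes through the same `shl` stack, gradients from all languages shape the shared feature space; the adversarial variant the section discusses would additionally penalize language-identifiable features in that space.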
“…In 2014, Yosinski of Cornell University carried out a study on the transferability of deep neural networks based on the ImageNet data sets [45], [46]. The results show that: (1) with the help of transfer learning, it is better to use an existing network than a neural network whose weights are randomly initialized and trained with a small amount of data; (2) fine-tuning of neural network parameters can achieve better transfer learning results.…”
Section: Transfer Learning (mentioning, confidence: 99%)