2015 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) 2015
DOI: 10.1109/icassp.2015.7178972
Long short-term memory language models with additive morphological features for automatic speech recognition

Abstract: Models of morphologically rich languages suffer from data sparsity when words are treated as atomic units. Word-based language models cannot transfer knowledge from common word forms to rarer variant forms. Learning a continuous vector representation of each morpheme allows a compositional model to represent a word as the sum of its constituent morphemes' vectors. Rare and unknown words containing common morphemes can thus be represented with greater fidelity despite their sparsity. Our novel neural network la…

Cited by 7 publications (3 citation statements)
References 11 publications
“…Theoretical analysis also helped us better understand the performance of the incremental algorithm. The natural future work is to extend our approach to other advanced NNLMs beyond CBOW and Skip-gram such as dependency RNN (Mirowski and Vlachos 2015) and LSTM or deeper RNN models (Renshaw and Hall 2015). 5 the LORELEI Contract HR0011-15-2-0025 with the US Defense Advanced Research Projects Agency (DARPA).…”
Section: Discussion
confidence: 99%
“…In the former, both local and global features were examined in [9,10]. Here, the local features are related to morpheme [10,11,12] or word [9,13] and the global features are related to sentence [14] or document [3,10,15]. Further, sociosituational settings were also examined for adaptation in [9].…”
Section: Related Work
confidence: 99%
“…Another way of integrating word compositions into CSLMs is to follow an additive approach, such that a word can be represented as the sum of vectors representing its morphemes. This approach was explored in LSTM NNLMs trained over a word vocabulary, and speech recognition accuracy was improved for the highly inflectional Russian language [16].…”
Section: Introduction
confidence: 99%
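The additive composition described in the abstract and in the statement above can be sketched in a few lines. This is a toy illustration with hand-made 2-D embeddings, not the paper's learned vectors: in the actual model each morpheme vector is a trained parameter, and the composed word vector feeds the LSTM language model.

```python
import numpy as np

# Toy morpheme embedding table (2-D for illustration; a real model
# learns these vectors jointly with the language model).
emb = {
    "un":    np.array([1.0, 0.0]),
    "break": np.array([0.0, 2.0]),
    "able":  np.array([0.5, 0.5]),
}

def word_vector(morphs, table):
    # Additive composition: the word's vector is the sum of its
    # constituent morphemes' vectors.
    return np.sum([table[m] for m in morphs], axis=0)

# "unbreakable" gets a representation even if the full word form
# never occurred in training, as long as its morphemes did.
v = word_vector(["un", "break", "able"], emb)
print(v)  # [1.5 2.5]
```

This is what lets rare and unknown inflected forms share statistical strength with common ones: any segmentation into known morphemes yields a usable vector.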