2018
DOI: 10.1007/s11042-018-5917-5
Training deep neural networks with non-uniform frame-level cost function for automatic speech recognition

Cited by 21 publications (6 citation statements) | References 63 publications
“…It is hoped that readers involved in the many applications of entropy to the assessment of uncertainty may be intrigued to consider the relevance of extropy to their deliberations. One such application to methods of automatic speech recognition has already appeared in [ 18 ]. Several implications for the analysis of order statistics have been discussed in [ 19 , 20 , 21 ].…”
Section: Discussion (mentioning)
Confidence: 99%
“…One can refer to [5] for the extropy properties of order statistics and record values. The applications of extropy in automatic speech recognition can be found in [6]. Various literature sources have presented a range of extropy measures and their extensions.…”
Section: Introduction (mentioning)
Confidence: 99%
“…The concept of extropy is useful in many fields: for instance, it is applied in automatic speech recognition [4]. In particular, the extropy of a network output with respect to the training set can be obtained in order to compute a kind of transformed cross entropy.…”
Section: Introduction (mentioning)
Confidence: 99%