2021
DOI: 10.1109/access.2021.3129847
Investigation of DNN-HMM and Lattice Free Maximum Mutual Information Approaches for Impaired Speech Recognition

Cited by 7 publications (1 citation statement) · References 34 publications
“…Gupta et al [52] suggested a residual network-based approach for detecting dysarthria severity level based on short speech segments, whereas Latha et al [53] employed deep learning and several acoustic cues to recognize dysarthric speech and generate discernible speech. Vishnika Veni and Chandrakala [54] researched the application of the deep neural network-hidden Markov model and lattice maximum mutual information technique for the successful identification of damaged speech. In [55], the authors suggested a histogram of states-based strategy for learning compact and discriminative embeddings for dysarthric voice detection using the deep neural network-hidden Markov model.…”
Section: Assessing Speech-Signal Impairments
confidence: 99%