Whispered speech recognition using deep denoising autoencoder (2017)
DOI: 10.1016/j.engappai.2016.12.012

Cited by 64 publications (33 citation statements) · References 33 publications
“…Examples of such areas are speech and speaker recognition. MFCCs have been shown to outperform other coefficients in these two areas and to provide a high-level approximation of human auditory perception [17], [26], [27], [28]. In this study, a 32-dimensional MFCC feature analysis (16 static MFCCs and 16 delta MFCCs) was used to form the observation vectors in each of CSPHMM1s, CSPHMM2s, and CSPHMM3s.…”
Section: Extraction of Features
confidence: 99%
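The 32-dimensional observation vectors described above (16 static MFCCs plus their 16 delta coefficients) can be sketched with NumPy. This is a minimal illustration, not the cited authors' code: the static MFCC matrix is simulated random data, and the delta computation uses the standard first-order regression formula with an assumed window half-width of 2.

```python
import numpy as np

def delta(features, N=2):
    """Standard first-order regression (delta) coefficients.

    features: (T, D) array of static cepstral coefficients.
    N: regression window half-width (2 is a common choice).
    """
    T, _ = features.shape
    denom = 2 * sum(n * n for n in range(1, N + 1))
    # Repeat edge frames so every frame has a full regression window.
    padded = np.pad(features, ((N, N), (0, 0)), mode="edge")
    out = np.zeros_like(features, dtype=float)
    for n in range(1, N + 1):
        out += n * (padded[N + n : N + n + T] - padded[N - n : N - n + T])
    return out / denom

# Hypothetical static MFCCs: 100 frames x 16 coefficients.
rng = np.random.default_rng(0)
static = rng.standard_normal((100, 16))

# Stack statics and deltas into 32-dimensional observation vectors.
obs = np.hstack([static, delta(static)])
print(obs.shape)  # (100, 32)
```

In practice the static coefficients would come from an MFCC front end rather than random data; only the stacking into a 32-dimensional observation vector per frame is the point here.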
“…The use of Teager energy cepstral coefficients with a deep denoising autoencoder (DDA) has recently brought many benefits in speaker-dependent (SD) neutral-trained whisper recognition [18]. Likewise, the performance of speaker-independent (SI) recognition of whispered speech has been significantly improved by adapting the acoustic model toward DDA pseudo-whisper samples, compared with adapting the model on an available small whisper set (for the UT-Vocal Effort II speech corpus) [13,15].…”
Section: Related Work
confidence: 99%
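A denoising autoencoder of the kind referenced above maps a corrupted input frame through an encoder to a hidden code and decodes a clean estimate; training minimizes the reconstruction error against the clean target. The following NumPy sketch shows only that forward pass and loss, under assumed sizes (32-dimensional cepstral frames, a 128-unit hidden layer, untrained random weights); it is not the architecture or training procedure from the cited papers.

```python
import numpy as np

rng = np.random.default_rng(1)

def init_layer(n_in, n_out):
    # Small random weights, zero biases (untrained; for illustration only).
    return rng.standard_normal((n_in, n_out)) * 0.1, np.zeros(n_out)

# Assumed sizes: 32-dim cepstral frames, 128-unit hidden code.
W_enc, b_enc = init_layer(32, 128)
W_dec, b_dec = init_layer(128, 32)

def dda_forward(noisy):
    """Encode a corrupted frame, then decode a clean estimate."""
    code = np.tanh(noisy @ W_enc + b_enc)  # nonlinear encoder
    return code @ W_dec + b_dec            # linear decoder

clean = rng.standard_normal((5, 32))                     # stand-in clean frames
noisy = clean + 0.1 * rng.standard_normal(clean.shape)   # corrupted input
recon = dda_forward(noisy)
loss = np.mean((recon - clean) ** 2)  # training would minimize this MSE
print(recon.shape)  # (5, 32)
```

Training would backpropagate the MSE through both layers; "deep" variants simply stack several such encoder layers before the decoder.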
“…Owing to the aforementioned properties, a DNN can form complex decision surfaces, which makes it superior to other conventional neural networks for classification problems. 6,7,[22][23][24][25][26][27][28][29]…”
Section: Introduction
confidence: 99%