2021
DOI: 10.18201/ijisae.2021473641

Punjabi Emotional Speech Database: Design, Recording and Verification

Abstract: This paper introduces the Punjabi Emotional Speech Database, created to evaluate the recognition of emotions in speech by humans and by a computer system. The database has been designed, recorded and verified using various standards, and the results set a benchmark for identifying emotions from Punjabi speech. Six emotions were simulated for the collection of the speech corpus: happy, sad, fear, anger, neutral and surprise. Fifteen speakers, aged 20-45 years, participated in the recording…

Cited by 13 publications (6 citation statements)
References 11 publications
“…A Punjabi speech database was created by Kaur and Singh [32] consisting of 900 emotional speech utterances by 15 speakers. Happiness, fearful, angry, surprised, sad, and neutral are the six emotions present in the database.…”
Section: Emotional Speech Databases For Indo-Aryan and Dravidian
confidence: 99%
“…The database was designed, recorded and verified using various standards. Ten sentences in the Punjabi language were recorded in each of the six basic emotions (happy, neutral, anger, fear, sad and surprise), for a total of 900 utterances [34].…”
Section: Punjabi Emotional Speech Corpus Preparation
confidence: 99%
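The corpus size reported above follows directly from the design parameters given in the abstract and the citation statement: 15 speakers each record 10 sentences in each of 6 emotions. A minimal sketch checking that arithmetic (the variable names are illustrative, not from the paper):

```python
# Corpus size check for the Punjabi Emotional Speech Database as described:
# 15 speakers x 10 sentences x 6 simulated emotions.
speakers = 15
sentences_per_emotion = 10
emotions = ["happy", "neutral", "anger", "fear", "sad", "surprise"]

total_utterances = speakers * sentences_per_emotion * len(emotions)
print(total_utterances)  # 900, matching the reported corpus size
```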
“…Deep learning (DL) forms more abstract high-level features by combining low-level features with nonlinear transformations, so that the model can better learn the distribution of the data [23]. Xiaodan put forward a tree-like LSTM in 2015. The use of LSTM effectively solved the problem of vanishing and exploding gradients during RNN backpropagation, and further optimized the depth model.…”
Section: Research Status Of Deep Neural Networks
confidence: 99%