2010
DOI: 10.1007/s10772-010-9070-4
Genetic algorithm based simultaneous optimization of feature subsets and hidden Markov model parameters for discrimination between speech and non-speech events

Abstract: Feature subsets and hidden Markov model (HMM) parameters are the two major factors that affect the classification accuracy (CA) of the HMM-based classifier. This paper proposes a genetic algorithm-based approach for simultaneously optimizing both feature subsets and HMM parameters, with the aim of obtaining the best HMM-based classifier. Experimental data extracted from three spontaneous speech corpora were used to evaluate the effectiveness of the proposed approach and the three other approaches (i.e. the approac…
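The abstract describes a single GA individual that jointly encodes a feature-selection mask and HMM parameters. Below is a minimal sketch of one plausible joint encoding; the names (`Chromosome`, `train_hmm`, `classify`) and the layout are illustrative assumptions, not the paper's actual representation.

```python
# Illustrative sketch only: one possible joint encoding of a feature subset
# (binary mask) and real-valued HMM parameters in a single GA chromosome.
# The structure and names are assumptions, not the paper's actual encoding.
import random
from dataclasses import dataclass

@dataclass
class Chromosome:
    feature_mask: list  # one bit per candidate feature (1 = feature kept)
    hmm_params: list    # flattened real-valued HMM parameters (e.g. Gaussian means)

def random_chromosome(n_features: int, n_params: int) -> Chromosome:
    return Chromosome(
        feature_mask=[random.randint(0, 1) for _ in range(n_features)],
        hmm_params=[random.uniform(-1.0, 1.0) for _ in range(n_params)],
    )

def fitness(chrom: Chromosome, train_hmm, classify) -> float:
    """Classification accuracy of an HMM trained on the selected features.
    `train_hmm` and `classify` are placeholders for the user's HMM toolkit."""
    selected = [i for i, bit in enumerate(chrom.feature_mask) if bit]
    model = train_hmm(selected, chrom.hmm_params)
    return classify(model, selected)  # e.g. accuracy on a held-out validation set
```

Because both factors live in one chromosome, a single fitness evaluation scores the feature subset and the HMM parameters together, which is what allows them to be optimized simultaneously rather than in separate stages.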


Cited by 10 publications (7 citation statements)
References 24 publications
“…GA is an optimization algorithm that simulates Darwin's genetic choice and biological natural selection process. (38,39) It uses a coding scheme to realize parameter optimization through iterative genetic operations, including selection, crossover, and mutation, where a fitness function is usually defined to find the optimal solution. For HMM parameter initialization in this study, floating point number coding is used as the coding scheme due to the high precision requirement when the mixed Gaussian probability density function is employed for generating the observation sequence.…”
Section: GA Initialization
confidence: 99%
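The excerpt above outlines the standard GA cycle (selection, crossover, and mutation driven by a fitness function) over a floating-point-coded chromosome. The following is a minimal, generic sketch of such a loop, assuming tournament selection, blend crossover, and Gaussian mutation; the `fitness` callable stands in for scoring candidate HMM initial parameters and is not taken from the cited implementation.

```python
# Minimal sketch of a GA loop with floating-point coding: selection, crossover,
# and mutation repeated over generations. The fitness function is a placeholder
# for evaluating decoded HMM parameters, not the cited work's implementation.
import random

def evolve(fitness, chrom_len, pop_size=30, generations=50,
           crossover_rate=0.8, mutation_rate=0.05):
    # Floating-point coding: each gene is a real value in [0, 1].
    pop = [[random.random() for _ in range(chrom_len)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        next_pop = scored[:2]                      # elitism: keep the two best
        while len(next_pop) < pop_size:
            # Tournament selection of two parents.
            p1 = max(random.sample(scored, 3), key=fitness)
            p2 = max(random.sample(scored, 3), key=fitness)
            # Arithmetic (blend) crossover, suited to real-valued genes.
            if random.random() < crossover_rate:
                a = random.random()
                child = [a * x + (1 - a) * y for x, y in zip(p1, p2)]
            else:
                child = p1[:]
            # Gaussian mutation on individual genes, clamped to [0, 1].
            child = [min(1.0, max(0.0, g + random.gauss(0, 0.1)))
                     if random.random() < mutation_rate else g
                     for g in child]
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)
```

For HMM initialization as described in the excerpt, `fitness` would decode the genes into initial model parameters and score them, for example by the likelihood of the training observation sequences under a Gaussian-mixture output model.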
“…For example, a methodology for learning specialised filter banks using deep neural networks was proposed in [6]. Moreover, several approaches based on evolutionary computation have been proposed for the search of optimal speech representations [7][8][9][10].…”
Section: Introduction
confidence: 99%
“…On the other hand, significant progress has been made with the application of different artificial intelligence techniques for feature selection. In particular, many works rely on evolutionary algorithms for feature subset optimization [10], [11], and for the search of optimal representations [12], [13], [14].…”
Section: Introduction
confidence: 99%