2014
DOI: 10.1016/j.amc.2013.12.139

Hybridizing Harmony Search algorithm with different mutation operators for continuous problems

Cited by 33 publications (5 citation statements)
References 20 publications
“…There are several mutation techniques that can be used in genetic algorithms, such as random mutation [18], boundary mutation [18], non-uniform random mutation [19], power mutation [17], polynomial mutation [18], and so forth. The experimental evaluation of the HS algorithm with different mutation methods on a well-known set of test functions shows that using polynomial mutation improves the performance of the algorithm significantly for a considerable number of functions [20]. Deb and Agrawal [18] suggested a polynomial mutation operator with a user-defined index parameter (η_m).…”
Section: Harmony Search Algorithms
confidence: 99%
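For readers unfamiliar with the operator referenced in the statement above, a minimal sketch of Deb and Agrawal's polynomial mutation for a single real-valued variable is given below. The function name, the default distribution index η_m = 20, and the clamping to the bounds are illustrative choices, not details taken from the cited papers.

```python
import random

def polynomial_mutation(x, lower, upper, eta_m=20.0):
    """Polynomial mutation of one decision variable (Deb & Agrawal style).

    The perturbation delta is drawn from a polynomial distribution whose
    spread is controlled by the distribution index eta_m: larger eta_m
    keeps the mutated value closer to the parent x.
    """
    u = random.random()
    if u < 0.5:
        delta = (2.0 * u) ** (1.0 / (eta_m + 1.0)) - 1.0
    else:
        delta = 1.0 - (2.0 * (1.0 - u)) ** (1.0 / (eta_m + 1.0))
    # Scale the perturbation by the variable's range and clamp to the bounds.
    x_new = x + delta * (upper - lower)
    return min(max(x_new, lower), upper)
```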
“…The control parameters of iCSPM2 were as follows: L = 1 in the Lévy flight method, PAR = 0.3 in the PA method, and the fraction of elite solutions p_b was set to a value equal to the value of p_a. The 15 benchmark functions described in Table 2 are well-recognized functions that have frequently been used to evaluate the efficiency of optimization algorithms [31,34]. These functions were used in Section 5.3 to test the sensitivity of iCSPM and iCSPM2 to the island model parameters.…”
Section: Algorithm 1: iCSPM with Elite Opposition-Based Learning and Multiple Mutation Methods (iCSPM2)
confidence: 99%
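The statement above fixes L = 1 for the Lévy flight component of the cuckoo-search-based iCSPM2. A minimal sketch of a standard Mantegna-style Lévy flight step is shown below; treating L as a simple step-size scaling constant, and the default stability parameter beta = 1.5, are assumptions made here for illustration and are not taken from the cited work.

```python
import math
import random

def levy_step(beta=1.5):
    """Draw one Lévy-distributed step using Mantegna's algorithm."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = random.gauss(0.0, sigma)
    v = random.gauss(0.0, 1.0)
    return u / abs(v) ** (1 / beta)

def levy_flight_move(x, best, L=1.0, beta=1.5):
    """Move a solution vector x relative to the current best with a Lévy step.

    L is treated as the step-size scaling constant (L = 1 in the quoted
    setup); its exact role inside iCSPM2 is an assumption here.
    """
    return [xi + L * levy_step(beta) * (xi - bi) for xi, bi in zip(x, best)]
```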
“…Secondly, it uses CS with HDP mutation (CS10) [2] to optimize the candidate solutions on each island. This is because, according to several experimental studies [31][32][33], CS10 is a better exploratory algorithm than the CS algorithm.…”
Section: Cuckoo Search with Jaya Mutation
confidence: 99%
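CS10 is identified in the quote as cuckoo search combined with highly disruptive polynomial (HDP) mutation. A minimal sketch of the HDP operator on a single variable follows; the distribution index η_m = 20 and the final clamping are illustrative assumptions, and how CS10 embeds this operator into the cuckoo search update is not shown here.

```python
import random

def hdp_mutation(x, lower, upper, eta_m=20.0):
    """Highly disruptive polynomial (HDP) mutation of one variable.

    Unlike basic polynomial mutation, the perturbation depends on the
    distances to both bounds, so even a variable sitting on a bound can
    be moved across the whole range -- one reason the operator is often
    described as more exploratory.
    """
    delta1 = (x - lower) / (upper - lower)
    delta2 = (upper - x) / (upper - lower)
    r = random.random()
    exp = 1.0 / (eta_m + 1.0)
    if r <= 0.5:
        delta_q = (2.0 * r + (1.0 - 2.0 * r) * (1.0 - delta1) ** (eta_m + 1.0)) ** exp - 1.0
    else:
        delta_q = 1.0 - (2.0 * (1.0 - r) + 2.0 * (r - 0.5) * (1.0 - delta2) ** (eta_m + 1.0)) ** exp
    x_new = x + delta_q * (upper - lower)
    return min(max(x_new, lower), upper)
```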
“…To overcome this, a modified version of PSO is proposed. In PSLPSO, a polynomial self-adaptation scheme inspired by the polynomial mutation operator [27] is employed to extend the exploration capability of the swarm and thereby improve performance. In addition, the value of the self-learning rate P_sl adapts to the diversity of the particles.…”
Section: Polynomial Self-Learning PSO
confidence: 99%
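One plausible reading of the PSLPSO description above is a dimension-wise, polynomial-mutation-style perturbation applied with probability P_sl. The sketch below reflects that reading only; the function name, the per-dimension application, and the fixed η_m are hypothetical, and the diversity-based adaptation of P_sl described in the cited work is not modelled.

```python
import random

def self_learning_perturbation(position, lower, upper, p_sl, eta_m=20.0):
    """Perturb a particle's position with probability p_sl per dimension.

    Each selected dimension receives a polynomial-mutation-style jump;
    p_sl plays the role of the self-learning rate from the quoted
    description (its diversity-driven adaptation is omitted here).
    """
    new_position = []
    for xi, lo, hi in zip(position, lower, upper):
        if random.random() < p_sl:
            u = random.random()
            if u < 0.5:
                delta = (2.0 * u) ** (1.0 / (eta_m + 1.0)) - 1.0
            else:
                delta = 1.0 - (2.0 * (1.0 - u)) ** (1.0 / (eta_m + 1.0))
            xi = min(max(xi + delta * (hi - lo), lo), hi)
        new_position.append(xi)
    return new_position
```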