2020
DOI: 10.1016/j.patcog.2020.107470
Binary coyote optimization algorithm for feature selection

Cited by 77 publications (21 citation statements)
References 45 publications
“…Then we make another comparison with all the algorithms presented in ( Too & Mirjalili, 2021 ), from which a total of 9 algorithms is imported. The other eight well-known algorithms mentioned there are: Binary Dragonfly Algorithm (BDA) ( Mirjalili, 2016 ), binary artificial bee colony (BABC) ( He, Xie, Wong, & Wang, 2018 ), binary multiverse optimizer (BMVO) ( Al-Madi, Faris, & Mirjalili, 2019 ), binary particle swarm optimization (BPSO) ( Kennedy & Eberhart, 1997 ), chaotic crow search algorithm (CCSA) ( Sayed, Hassanien, & Azar, 2019 ), binary coyote optimization algorithm (BCOA) ( de Souza, de Macedo, dos Santos Coelho, Pierezan, & Mariani, 2020 ), evolution strategy with covariance matrix adaptation (CMA-ES) ( Hansen & Kern, 2004 ), and success-history based adaptive differential evolution with linear population size reduction (LSHADE) ( Tanabe & Fukunaga, 2014 ). Table 1 presents the parameters used by these algorithms.…”
Section: Results
confidence: 99%
“…The COA optimizer is a swarm intelligence algorithm, proposed by [41] to solve optimization problems, that models the social relations of the Canis latrans species and its adaptation to the environment. Accordingly, the COA mechanism is designed around the social conditions of the coyotes, which represent the decision variables x of a global optimization problem [54].…”
Section: Coyote Optimization Algorithm
confidence: 99%
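Since the coyotes' social conditions are continuous decision variables x, feature selection requires mapping them to a binary keep/drop mask. A minimal sketch of one common binarization scheme from the binary-metaheuristic literature (a sigmoid transfer function thresholded against uniform random numbers; this is an assumed generic scheme, not necessarily the exact rule of the cited paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid_binarize(x, rng):
    """Map a continuous position vector x to a binary feature mask.

    probs[i] is the probability that feature i is selected; each entry
    is then sampled against a uniform random draw (assumed scheme).
    """
    probs = 1.0 / (1.0 + np.exp(-x))               # S-shaped transfer function
    return (rng.random(x.shape) < probs).astype(int)

# Example: a coyote's continuous position over 8 candidate features
position = rng.normal(size=8)
mask = sigmoid_binarize(position, rng)
print(mask)  # binary vector: 1 = feature selected, 0 = discarded
```

Other transfer-function families (e.g. V-shaped) are used in the literature as well; the choice affects how aggressively the search flips bits.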
“…This algorithm has recently been applied in several applications, especially feature selection [54], tuning heavy-duty gas turbine hyperparameters [55], optimal power flow for transmission power networks [56], network reconfiguration [57], and optimal parameter estimation of a proton exchange membrane fuel cell [58]. Given these promising results, and since a search of the literature reveals that the COA has not yet been applied to define the CEEMD's hyperparameters, it is adopted here.…”
Section: Coyote Optimization Algorithm
confidence: 99%
“…Although the wrapper-based methods are computationally expensive and their performance depends on the utilized learning algorithm, they are usually more accurate than the other two categories [13]. Wrapper-based methods use different search approaches, such as exhaustive, random, greedy, heuristic, and metaheuristic [14]; except for the last, these approaches are impractical for selecting effective features from medium and large datasets [15]. Thus, a wide range of metaheuristic optimization algorithms has been proposed to solve feature selection problems for applications with large datasets, such as medicine [16].…”
Section: Introduction
confidence: 99%
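In wrapper-based feature selection with metaheuristics, each candidate subset is typically scored by a fitness that trades classification error against subset size. A minimal sketch of this widely used formulation (the weighting alpha=0.99 is an assumed illustrative value, not one taken from the cited works):

```python
def wrapper_fitness(error_rate, n_selected, n_total, alpha=0.99):
    """Common wrapper-style fitness for feature selection (lower is better).

    Combines the learner's classification error with the fraction of
    features kept; alpha weights accuracy against subset compactness.
    """
    return alpha * error_rate + (1.0 - alpha) * (n_selected / n_total)

# With equal error, the smaller subset gets the better (lower) fitness
f_small = wrapper_fitness(error_rate=0.10, n_selected=5, n_total=50)
f_large = wrapper_fitness(error_rate=0.10, n_selected=40, n_total=50)
print(f_small < f_large)  # True
```

The metaheuristic (e.g. a binary COA) minimizes this fitness, so it is pushed toward subsets that are both accurate and small; this is also why wrappers are expensive, since every fitness evaluation retrains the underlying learner.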