2018
DOI: 10.1504/ijbic.2018.094616

Optimisation inspiring from behaviour of raining in nature: droplet optimisation algorithm

Cited by 17 publications (5 citation statements). References: 0 publications.
“…The clustering ensemble approaches with homogeneous clustering algorithms employ the same clustering algorithm during generation of the ensemble pool, that is, all partitions of the ensemble pool are generated by the same clustering algorithm. The partitions of the ensemble pool in homogeneous clustering algorithms can be produced by one of the following subtypes: by employing different initializations of a given clustering algorithm, by employing different parameters (like different numbers of clusters) for data clustering using the same clustering algorithm, by employing different data projections for data clustering using the same clustering algorithm, by employing different subsets of dataset features for data clustering using the same clustering algorithm, by employing metaheuristic algorithms for data clustering, and by employing different datasets for data clustering using the same clustering algorithm. …”
Section: Related Work (mentioning)
confidence: 99%

“…The clustering ensemble approaches with homogeneous clustering algorithms employ the same clustering algorithm during generation of the ensemble pool, that is, all partitions of the ensemble pool are generated by the same clustering algorithm. The partitions of the ensemble pool in homogeneous clustering algorithms can be produced by one of the following subtypes: … [24,34,38,39,43,52] for data clustering [42], and 6. by employing different datasets for data clustering using the same clustering algorithm.…”
Section: Ensemble Generation Problem (mentioning)
confidence: 99%

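The homogeneous ensemble-generation subtypes described in the two statements above are straightforward to reproduce. Below is a minimal sketch, assuming Python with scikit-learn and k-means as the base clusterer; the dataset, pool sizes, and parameter ranges are illustrative choices rather than settings from the cited works, and the consensus step that would combine the pool into a final partition is omitted.

    # Minimal sketch: building a homogeneous clustering ensemble pool with k-means.
    # The data, pool sizes, and parameter ranges below are illustrative assumptions.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.datasets import make_blobs

    X, _ = make_blobs(n_samples=300, centers=4, random_state=0)
    pool = []

    # Subtype: different random initializations of the same algorithm.
    for seed in range(5):
        pool.append(KMeans(n_clusters=4, n_init=1, random_state=seed).fit_predict(X))

    # Subtype: different parameters (here, different numbers of clusters).
    for k in (2, 3, 4, 5, 6):
        pool.append(KMeans(n_clusters=k, n_init=1, random_state=0).fit_predict(X))

    # Subtype: different subsets of the dataset features.
    rng = np.random.default_rng(0)
    for _ in range(5):
        cols = rng.choice(X.shape[1], size=max(1, X.shape[1] // 2), replace=False)
        pool.append(KMeans(n_clusters=4, n_init=1, random_state=0).fit_predict(X[:, cols]))

    print(len(pool), "base partitions in the ensemble pool")
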
“…In recent years, a large number of bio-inspired optimization algorithms have been proposed, such as the genetic algorithm (GA), bat algorithm (BA), differential evolution (DE), and firefly algorithm (FA), and they have been applied to various fields, including malicious code detection, LEACH protocol optimization, privacy preservation, intrusion detection systems, and job scheduling. Hence, more scholars have begun to use them to decrease the error of the DV-Hop positioning algorithm because of their good global convergence and high robustness.…”
Section: Introduction (mentioning)
confidence: 99%

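As a concrete illustration of the idea in the statement above, one of the listed algorithms (differential evolution) can refine a DV-Hop position estimate by minimizing the mismatch between hop-based distance estimates and geometric distances to the anchor nodes. This is a minimal sketch assuming SciPy; the anchor layout, hop counts, and average hop size are made-up placeholder values, not data from the cited works.

    # Minimal sketch: refining a DV-Hop position estimate with differential evolution.
    # Anchor coordinates, hop counts, and the average hop size are illustrative only.
    import numpy as np
    from scipy.optimize import differential_evolution

    anchors = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
    hop_counts = np.array([3, 4, 4, 5])   # hops from the unknown node to each anchor
    avg_hop_size = 18.0                   # average per-hop distance computed by DV-Hop
    est_dist = hop_counts * avg_hop_size  # DV-Hop distance estimates

    def positioning_error(p):
        # Sum of squared differences between geometric and hop-estimated distances.
        d = np.linalg.norm(anchors - p, axis=1)
        return float(np.sum((d - est_dist) ** 2))

    result = differential_evolution(positioning_error, bounds=[(0.0, 100.0), (0.0, 100.0)], seed=1)
    print("estimated node position:", result.x)
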
“…The algorithm is simple and convenient. Through comparison and analysis, it is found that the droplet optimization algorithm is better than some of the latest optimization algorithms [10]. Omidvar et al., based on the random behavior of natural phenomena and the behavior of chicks, constructed the See-See Partridge Chicks Optimization (SSPCO) algorithm and compared it with the latest optimization algorithms.…”
Section: Introduction (mentioning)
confidence: 99%