2018
DOI: 10.1080/25742558.2018.1483565

A whale optimization algorithm (WOA) approach for clustering

Cited by 200 publications (93 citation statements)
References 29 publications
“…WOA is considered a global optimizer, whereas PSO and GA do not guarantee the global optimum and are regarded as local optimizers. Moreover, PSO slows down near the global optimum because each particle's position depends mainly on its personal best (Pbest) and the global best (Gbest), which increases the chance of it getting stuck at a local optimum instead of the global one [44]. From Figure 11, we can see that at the base-case reactive loading, WOA converged at 0.9995 × 10⁷ ($), whereas PSO and GA converged at 1.0714 × 10⁷ ($) and 1.012 × 10⁷ ($), respectively.…”
Section: Percentage (mentioning)
confidence: 99%
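
To make the stagnation behaviour described in this excerpt concrete, below is a minimal sketch of the canonical PSO velocity/position update it refers to; the inertia weight w and acceleration coefficients c1 and c2 are illustrative defaults, not values from the cited study.

import numpy as np

def pso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rng=None):
    # Canonical PSO update (sketch). Each particle is pulled toward its
    # personal best (pbest) and the swarm's global best (gbest); once these
    # attractors coincide near a local optimum, the pull terms shrink and the
    # swarm can stall there, which is the behaviour the excerpt describes.
    rng = np.random.default_rng() if rng is None else rng
    r1 = rng.random(x.shape)
    r2 = rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return x + v, v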
“…This mechanism is performed by a software agent, referred to as the "Incoming Data Handler" (IDH), whose pseudocode is reported in Algorithm 4 to further clarify this process and allow for its implementation.

    if merged is false then
        if m_p is density reachable to mc then
            if new radius ≤ mc then
                Merge m_p with mc
            else
                Add m_p to p-microclusters
            end if
            merged = true
        end if
    end if
end for
if merged is false then
    for each mc in o-microclusters do
        if merged is false then
            if m_p is density reachable to mc then
                if new radius ≤ mc then
                    Merge m_p with mc
                    merged = true
                end if
            end if
        end if
    end for
end if
if merged is false then
    Add m_p to o-microclusters
end if
end
return…”
Section: Handling Incoming Data Points (mentioning)
confidence: 99%
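
As a rough illustration of the merge logic quoted above, the following Python sketch reproduces its control flow; the helpers density_reachable, new_radius and radius_threshold, and the merge() method, are hypothetical stand-ins for the paper's own definitions rather than its actual API.

def handle_incoming(m_p, p_microclusters, o_microclusters,
                    density_reachable, new_radius, radius_threshold):
    # Sketch of the quoted merge logic; the helper callables are assumptions.
    # 1. Try the potential (p-) micro-clusters first.
    for mc in p_microclusters:
        if density_reachable(m_p, mc):
            if new_radius(m_p, mc) <= radius_threshold(mc):
                mc.merge(m_p)                # absorb m_p into the existing p-micro-cluster
            else:
                p_microclusters.append(m_p)  # keep m_p as its own p-micro-cluster
            return
    # 2. Otherwise try the outlier (o-) micro-clusters.
    for mc in o_microclusters:
        if density_reachable(m_p, mc) and new_radius(m_p, mc) <= radius_threshold(mc):
            mc.merge(m_p)
            return
    # 3. Fall back to registering m_p as a new outlier micro-cluster.
    o_microclusters.append(m_p)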
“…Amongst the most commonly used optimisation paradigms of this kind, it is worth mentioning the established Differential Evolution (DE) framework [8][9][10], as well as more recent nature-inspired algorithms from the Swarm Intelligence (SI) field, such as the Whale Optimisation Algorithm (WOA) [11] and the Bat-inspired algorithm in [12], here referred to as BAT. Although the literature is replete with examples of data clustering strategies based on DE, WOA and BAT for the static domain, such as those presented in [13][14][15][16], little has been done for the dynamic environment due to the difficulties in handling data streams. The current state of dynamic clustering is therefore unsatisfactory, as it mainly relies on algorithms based on techniques such as density microclustering and density grid-based clustering, which require the tuning of several parameters to work effectively [17].…”
Section: Introduction (mentioning)
confidence: 99%
“…The three major stages of this algorithm are (a) the shrinking encircling of prey, (b) exploitation (i.e., bubble-net attacking), and (c) exploration (i.e., searching for the prey). More information about the WOA can be found in previous studies [37][38][39][40]. The LCA was proposed by Kashan [41], based on the competition of teams in sports leagues.…”
Section: Hybrid Metaheuristic Algorithms (mentioning)
confidence: 99%
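
For reference, the three stages listed in this excerpt correspond to the position-update rules of the original WOA formulation by Mirjalili and Lewis. The sketch below is an illustrative implementation of those standard rules, not the cited hybrid study's exact code; the spiral constant b and the 50/50 branch choice follow the original paper.

import numpy as np

def woa_step(X, best, t, max_iter, b=1.0, rng=None):
    # One iteration of the standard WOA update (sketch).
    # X: (n_whales, dim) current positions; best: (dim,) best solution so far ("prey").
    rng = np.random.default_rng() if rng is None else rng
    a = 2.0 * (1.0 - t / max_iter)            # a decreases linearly from 2 to 0
    X_new = np.empty_like(X)
    for i, x in enumerate(X):
        A = 2.0 * a * rng.random() - a        # |A| >= 1 triggers exploration
        C = 2.0 * rng.random(x.shape)
        if rng.random() < 0.5:                # shrinking encircling branch
            if abs(A) < 1.0:                  # exploitation: move toward the best whale
                D = np.abs(C * best - x)
                X_new[i] = best - A * D
            else:                             # exploration: move relative to a random whale
                x_rand = X[rng.integers(len(X))]
                D = np.abs(C * x_rand - x)
                X_new[i] = x_rand - A * D
        else:                                 # bubble-net spiral around the best whale
            l = rng.uniform(-1.0, 1.0)
            D_prime = np.abs(best - x)
            X_new[i] = D_prime * np.exp(b * l) * np.cos(2.0 * np.pi * l) + best
    return X_new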