2013
DOI: 10.1007/s10462-013-9400-4
A review on particle swarm optimization algorithm and its variants to clustering high-dimensional data

Cited by 283 publications (96 citation statements)
References 90 publications
“…The first one is based on bees [Karaboga/Akay, 2009], and the second is based on foraging theory [Stephens/Krebs, 1986]. For clustering tasks, PSO has mainly been applied in hybrid algorithms [Esmin et al, 2015]; e.g., [Van der Merwe/Engelbrecht, 2003] applied PSO combined with k-means clustering. Here, it is argued that the hybridization of PSO and k-means may improve the choice of centroids or may, in some special cases, even allow the problem of the number of clusters to be solved.…”
Section: Swarm Intelligence for Unsupervised Machine Learning (mentioning)
Confidence: 99%
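The hybridization described in this statement can be illustrated with a short sketch. The Python code below is only a rough, assumed reading of such a PSO/k-means hybrid (it is not the exact scheme of Van der Merwe/Engelbrecht, 2003): each particle encodes a full set of k centroids, fitness is the quantization error, and one particle is seeded with a plain k-means solution. All function and parameter names are illustrative.

```python
import numpy as np

def quantization_error(centroids, data):
    """Mean distance from each point to its closest centroid (lower is better)."""
    d = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
    return d.min(axis=1).mean()

def kmeans(data, k, iters=20, rng=None):
    """Plain Lloyd's algorithm, used here only to seed one PSO particle."""
    rng = rng or np.random.default_rng(0)
    centroids = data[rng.choice(len(data), k, replace=False)]
    for _ in range(iters):
        labels = np.linalg.norm(data[:, None] - centroids[None], axis=2).argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = data[labels == j].mean(axis=0)
    return centroids

def pso_kmeans_clustering(data, k, n_particles=10, iters=100,
                          w=0.72, c1=1.49, c2=1.49, seed=0):
    """Hybrid sketch: each particle encodes k centroids; one particle is
    seeded with the k-means solution, the rest start from random data points."""
    data = np.asarray(data, dtype=float)
    rng = np.random.default_rng(seed)
    pos = data[rng.choice(len(data), (n_particles, k))]   # (n_particles, k, dim)
    pos[0] = kmeans(data, k, rng=rng)                      # k-means seed
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([quantization_error(p, data) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([quantization_error(p, data) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()
```

Seeding one particle with the k-means result gives the swarm a reasonable starting centroid set, while the remaining randomly initialized particles preserve global exploration.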
“…Based on a target function defined by the user, many other mathematical optimization algorithms have been proposed to facilitate parameter identification, including the genetic algorithm (GA) and continuous function optimization methods [27][28][29]; their individual advantages and limitations are not discussed in detail here. In the present study, a particle swarm optimization (PSO) algorithm is implemented for parameter identification; a detailed introduction to the PSO algorithm can be found in the literature [30]. In the PSO algorithm, every population member generated in the first trial continuously updates its candidate solution by tracking the personal best (pBest), the position yielding the minimal target-function value found by that member itself, and the global best (gBest), the position yielding the minimal target-function value found among all population members in each iteration.…”
Section: Parameters Calibration (mentioning)
Confidence: 99%
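As a hedged illustration of the pBest/gBest bookkeeping described in this passage (not the calibration code of the cited study), a minimal PSO minimizer for an arbitrary user-defined target function might look as follows; all names and default coefficients are illustrative.

```python
import numpy as np

def pso_identify(target_fn, bounds, n_particles=30, iters=200,
                 w=0.72, c1=1.49, c2=1.49, seed=0):
    """Minimal PSO minimizer for parameter identification.

    target_fn maps a parameter vector to a scalar target value (e.g. a fitting
    residual); bounds is a sequence of (low, high) pairs, one per parameter."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, float).T
    dim = lo.size
    pos = rng.uniform(lo, hi, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()                                   # personal best positions
    pbest_val = np.apply_along_axis(target_fn, 1, pos)   # personal best values
    g = pbest_val.argmin()
    gbest, gbest_val = pbest[g].copy(), pbest_val[g]     # global best
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.apply_along_axis(target_fn, 1, pos)
        better = vals < pbest_val                        # update pBest per member
        pbest[better], pbest_val[better] = pos[better], vals[better]
        g = pbest_val.argmin()
        if pbest_val[g] < gbest_val:                     # update gBest over all members
            gbest, gbest_val = pbest[g].copy(), pbest_val[g]
    return gbest, gbest_val
```

A typical call would pass a residual such as `lambda p: np.sum((simulate(p) - measured)**2)`, where `simulate` and `measured` stand in for the user's model and data (both hypothetical here).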
“…This results in a partitioning of the data space into Voronoi cells [25]. The cluster centers are substituted for the center positions of the food sources, and the formula for computing the centers is shown in (4). If the i-th cluster contains n_i members and the members are denoted as x_1, x_2, …”
Section: K-means (mentioning)
Confidence: 99%
“…, } and the total number of iterations T;
(3) Calculate fitness values of individuals;
(4) Normalize the dimensions of each individual as in Algorithm 2;
(5) Initialize the fitness tree as in Algorithm 1;
(6) = 1; = 1; Loop:
(7) While ( < T)
(8)   If ( > ∑ )
(9)     Partition the whole population into multiple species based on K-means clustering; = + 1;
(10)  EndIf
(11)  Update the position of individuals according to Eqs. (6) and (7);
(12)  Normalize the dimensions of each individual as in Algorithm 2;
(13)  Insert new individuals into the fitness tree as in Algorithm 1;
(14)  Update some individuals from the fitness tree;
(15)  Calculate fitness values of individuals; …”
Section: Discrete Dynamics in Nature and Society (mentioning)
Confidence: 99%
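Some of the loop variables in the listing above were lost in extraction, but the species-partitioning step in lines (8)–(9) can be sketched as k-means applied to the particles' current positions. The sketch below is an assumed reading with illustrative names; the fitness-tree bookkeeping of Algorithms 1 and 2 is omitted.

```python
import numpy as np

def partition_into_species(positions, n_species, iters=15, seed=0):
    """Group swarm particles into species by running k-means on their positions.

    positions: (n_particles, dim) array of current particle positions.
    Returns an integer species label per particle. Illustrative sketch only."""
    positions = np.asarray(positions, dtype=float)
    rng = np.random.default_rng(seed)
    centers = positions[rng.choice(len(positions), n_species, replace=False)]
    labels = np.zeros(len(positions), dtype=int)
    for _ in range(iters):
        # assign each particle to its nearest species center
        d = np.linalg.norm(positions[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each center to the mean position of its species
        for s in range(n_species):
            if np.any(labels == s):
                centers[s] = positions[labels == s].mean(axis=0)
    return labels

# Inside the main PSO loop, each species would then update its members against
# a species-local best rather than a single global best.
```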