2021
DOI: 10.1155/2021/6666619

An Improved Integrated Clustering Learning Strategy Based on Three‐Stage Affinity Propagation Algorithm with Density Peak Optimization Theory

Abstract: To obtain more precise clustering results on data samples of different shapes and densities with the affinity propagation (AP) clustering algorithm, this paper proposed an improved integrated clustering learning strategy based on a three-stage affinity propagation algorithm with density peak optimization theory (DPKT-AP). DPKT-AP combines the idea of integrated clustering with the AP algorithm, introducing density peak theory and the k-means algorithm to carry out three-stage clustering …

Cited by 4 publications (3 citation statements)
References 32 publications
“…For a dataset to be clustered, in many cases, users do not care whether it contains outliers, because the outliers themselves are difficult to define, but they definitely want to assign them cluster labels. Wang et al. [34] proposed an improved integrated clustering learning strategy based on three-stage affinity propagation algorithm with density peak optimization theory (DPKT-AP). In the first stage, the clustering center point was selected by density peak clustering.…”
Section: Related Work
confidence: 99%
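The first-stage step quoted above can be sketched as follows — a minimal illustration of density-peak center selection, assuming the standard ρ/δ decision criterion of Rodriguez and Laio; the Gaussian density kernel, the cutoff `dc`, and the toy data are illustrative choices, not taken from Wang et al.:

```python
import numpy as np

def density_peak_centers(X, dc, n_centers):
    """Pick cluster centers density-peak style: a center has both a
    high local density rho and a large distance delta to any point
    of higher density."""
    n = len(X)
    # pairwise Euclidean distances
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # Gaussian-kernel local density (continuous, so ties are unlikely)
    rho = np.exp(-(d / dc) ** 2).sum(axis=1) - 1.0
    # delta: distance to the nearest point of strictly higher density
    delta = np.empty(n)
    for i in range(n):
        higher = np.where(rho > rho[i])[0]
        delta[i] = d[i].max() if higher.size == 0 else d[i, higher].min()
    gamma = rho * delta                      # decision-graph score
    return np.argsort(gamma)[::-1][:n_centers]

# Two separated blobs: the densest point of each blob scores highest.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, (20, 2)),
               rng.normal(5.0, 0.3, (20, 2))])
centers = density_peak_centers(X, dc=0.5, n_centers=2)
```

The ρ·δ ranking is what separates density peaks from mere dense points: a point deep inside a cluster has high ρ but a tiny δ, so only one representative per cluster scores highly.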
“…Furthermore, the damping factor and the preference parameter are fixed by the user. Finally, in [61,62], the AP algorithm was combined with other methods to improve the learning task. For these last two methods, prior knowledge is required for the learning task, the preference parameter is also not adaptive, and the damping factor must also be chosen by the user.…”
Section: Related Studies
confidence: 99%
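Where the two user-chosen inputs mentioned above enter the algorithm can be illustrated on a toy similarity matrix — a sketch assuming the standard AP responsibility update; the point values and the median-preference heuristic are illustrative assumptions:

```python
import numpy as np

# Toy similarities s(i,k) = -|x_i - x_k|^2 for three 1-D points.
x = np.array([0.0, 1.0, 5.0])
S = -(x[:, None] - x[None, :]) ** 2

# The preference s(k,k) is a user-chosen input: larger values make more
# points willing to serve as exemplars. A common heuristic is the median
# of the off-diagonal similarities.
preference = np.median(S[~np.eye(3, dtype=bool)])
np.fill_diagonal(S, preference)

# One responsibility update r(i,k) = s(i,k) - max_{k'!=k}(a(i,k') + s(i,k')),
# followed by the damped blend controlled by the user-chosen damping factor.
A = np.zeros_like(S)            # availabilities start at zero
R_old = np.zeros_like(S)
damping = 0.5
AS = A + S
R_new = np.empty_like(S)
for i in range(3):
    for k in range(3):
        others = [kk for kk in range(3) if kk != k]
        R_new[i, k] = S[i, k] - AS[i, others].max()
R = damping * R_old + (1 - damping) * R_new
```

Neither value adapts during the run: the preference fixes how many exemplars tend to emerge, and the damping factor only trades convergence speed against oscillation.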
“…The number of identified exemplars (clusters) is influenced by the input preference values, although it also emerges from the message-passing procedure. The sum of r(i, k) and a(i, k) is used to determine whether or not data point k is a candidate exemplar [22], [23]. After the exemplars have been selected, points placed closer to a competing candidate exemplar k′ are assigned to that exemplar's cluster.…”
Section: K-Affinity Propagation
confidence: 99%
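The decision rule quoted above can be sketched with a minimal message-passing loop — a hedged illustration of the standard AP updates, not the exact implementation cited in [22], [23]; the similarity construction, preference, and damping values are assumptions:

```python
import numpy as np

def affinity_propagation(S, damping=0.9, iters=200):
    """Minimal vectorised affinity propagation. S carries the
    user-chosen preferences on its diagonal."""
    n = S.shape[0]
    R = np.zeros((n, n))
    A = np.zeros((n, n))
    for _ in range(iters):
        # Responsibilities: r(i,k) = s(i,k) - max_{k'!=k}(a(i,k') + s(i,k'))
        AS = A + S
        top = AS.argmax(axis=1)
        first = AS[np.arange(n), top]
        AS[np.arange(n), top] = -np.inf
        second = AS.max(axis=1)
        R_new = S - first[:, None]
        R_new[np.arange(n), top] = S[np.arange(n), top] - second
        R = damping * R + (1 - damping) * R_new
        # Availabilities: a(i,k) = min(0, r(k,k) + sum max(0, r(i',k)))
        Rp = np.maximum(R, 0)
        np.fill_diagonal(Rp, np.diag(R))
        A_new = Rp.sum(axis=0)[None, :] - Rp
        dA = np.diag(A_new).copy()
        A_new = np.minimum(A_new, 0)
        np.fill_diagonal(A_new, dA)
        A = damping * A + (1 - damping) * A_new
    # Decision rule from the text: k is an exemplar iff r(k,k) + a(k,k) > 0.
    exemplars = np.where(np.diag(R + A) > 0)[0]
    # Remaining points join the most similar exemplar.
    labels = exemplars[S[:, exemplars].argmax(axis=1)]
    labels[exemplars] = exemplars
    return exemplars, labels

# Two tight, well-separated blobs; similarities are negative squared distances.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (20, 2)),
               rng.normal(5.0, 0.1, (20, 2))])
S = -((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
np.fill_diagonal(S, -10.0)   # user-chosen preference
exemplars, labels = affinity_propagation(S)
```

The final two lines of the function mirror the quoted passage: the diagonal of R + A picks the exemplars, and every other point is then assigned to whichever candidate exemplar it is most similar to.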