2020
DOI: 10.1007/s00500-020-05247-2
Performance-enhanced rough k-means clustering algorithm

Cited by 12 publications (6 citation statements)
References 57 publications
“…Multiple rough clustering approaches have been considered in the literature [52,53,62,69,70,76,98]; see also the section on clustering in the review by Bello and Falcon [6]. Nonetheless, the rough clustering model itself has been formalized by Lingras and Peters in [53], using the following three axioms: (RC1) Any instance x belongs to at most one lower approximation:…”
Section: Rough Clustering
Confidence: 99%
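Axiom (RC1) quoted above — each instance belongs to at most one lower approximation — can be illustrated with the standard rough k-means assignment rule of Lingras and Peters. The sketch below is illustrative, not taken from the cited paper; the function name and the ratio threshold `epsilon` are assumptions:

```python
import numpy as np

def rough_assign(X, centroids, epsilon=1.3):
    """Sketch of a rough k-means assignment step.

    A point goes into the lower approximation of its nearest cluster
    unless another centroid is almost as close (within the distance
    ratio epsilon); in that case it goes into the upper approximations
    of all such clusters instead. This guarantees axiom (RC1): no
    point ever lands in two lower approximations.
    """
    k = len(centroids)
    lower = [[] for _ in range(k)]
    upper = [[] for _ in range(k)]
    for i, x in enumerate(X):
        d = np.linalg.norm(centroids - x, axis=1)
        nearest = int(np.argmin(d))
        # clusters whose centroid is nearly as close as the nearest one
        close = [j for j in range(k)
                 if j != nearest and d[j] <= epsilon * d[nearest]]
        if close:
            # boundary point: upper approximations only
            for j in [nearest] + close:
                upper[j].append(i)
        else:
            # certain point: lower approximation (and hence upper, too)
            lower[nearest].append(i)
            upper[nearest].append(i)
    return lower, upper
```

With two well-separated centroids, a midway point ends up in both upper approximations but in neither lower one, which is exactly the boundary-region behaviour the axioms formalize.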
“…The goals are to improve the survival rate of nodes, reduce energy consumption, and make the execution of UAV tasks more flexible. Table 3 compares the application conditions of the CHSA algorithm with those of the two mainstream alternatives, the ant colony algorithm and the K-means algorithm. Although some algorithms have been improved on the basis of these two [16], [17], the most common algorithms for cluster-head selection without a base station remain the ant colony algorithm and the K-means algorithm [18]. A comparison of these algorithms in terms of node survival rate, energy consumption, and other attributes is detailed in Section 5 and will not be repeated here.…”
Section: Description Of Algorithm
Confidence: 99%
“…Add the within-cluster sum of squares of every cluster in the data set to obtain the total sum of squares, as shown in formula (3). The smaller the total sum of squares, the more similar the samples within each cluster and the better the clustering effect [12]. When the total sum of squares (formula (4)) …”
Section: 4.1 K-means Clustering
Confidence: 99%
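The quantity described in the quote above — the per-cluster sum of squared distances to the centroid, summed over all clusters — can be sketched as follows (the function name and inputs are illustrative, not from the cited paper):

```python
import numpy as np

def total_within_ss(X, labels, centroids):
    """Total within-cluster sum of squares.

    For each cluster, sum the squared Euclidean distances of its
    members to the cluster centroid, then add the per-cluster sums.
    Smaller totals mean tighter, more homogeneous clusters.
    """
    total = 0.0
    for j, c in enumerate(centroids):
        members = X[labels == j]
        total += float(np.sum((members - c) ** 2))
    return total
```

For example, two clusters whose members each sit one unit from their centroid along one axis contribute 2.0 apiece, giving a total of 4.0.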