Competitive clustering algorithms based on ultrametric properties (2013)
DOI: 10.1016/j.jocs.2011.11.004

Cited by 13 publications (7 citation statements)
References 8 publications
“…Indeed, in Ref. , a recursive clustering method was proposed to elect Cnodes, based on two clustering algorithms: LEACH, a widely used clustering protocol for sensor networks, and the Fast and Flexible Clustering Algorithm (FFUCA). Recursive_Leach (R‐Leach): LEACH uses a clustering technique that divides the network into subsets called clusters, where each cluster is formed of a cluster head (CH) and the nodes that belong to it.…”
Section: Numerical Results
confidence: 99%
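The LEACH-style cluster formation quoted above can be sketched as follows. This is a minimal illustration, not the algorithm from the cited paper: the election probability `p` and the random node coordinates are assumed for the example.

```python
import math
import random

def leach_round(nodes, p=0.05):
    """One round of LEACH-style cluster formation (illustrative sketch).

    Each node elects itself cluster head (CH) with probability p; every
    remaining node joins the nearest CH, so the network is divided into
    clusters, each formed of a CH and the nodes that belong to it."""
    heads = [n for n in nodes if random.random() < p]
    if not heads:                      # guarantee at least one CH per round
        heads = [random.choice(nodes)]
    clusters = {h: [h] for h in heads}
    for n in nodes:
        if n in heads:
            continue
        nearest = min(heads, key=lambda h: math.dist(n, h))
        clusters[nearest].append(n)
    return clusters

random.seed(0)
nodes = [(random.random(), random.random()) for _ in range(50)]
clusters = leach_round(nodes)
```

Real LEACH also rotates the CH role across rounds to balance energy consumption; that rotation rule is omitted here for brevity.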
“…Another approach consists of a different clustering method: the number of clusters is not fixed in advance but is updated dynamically, as in competitive agglomeration clustering [3] or [5].…”
Section: Discussion
confidence: 99%
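The idea of a dynamically determined number of clusters can be illustrated with a small agglomerative sketch: start with singletons and merge the two closest clusters (centroid linkage) while they are closer than a threshold. The `merge_dist` threshold and the sample points are assumptions for the example; this is not the competitive agglomeration algorithm of [3] or [5] itself.

```python
import math

def agglomerate(points, merge_dist=1.0):
    """Agglomerative clustering sketch where the number of clusters is
    not set in advance: clusters are merged (centroid linkage) while the
    closest pair is within merge_dist, so the final count emerges from
    the data."""
    clusters = [[p] for p in points]

    def centroid(c):
        return tuple(sum(coord) / len(c) for coord in zip(*c))

    while len(clusters) > 1:
        i, j = min(
            ((a, b) for a in range(len(clusters)) for b in range(a + 1, len(clusters))),
            key=lambda ab: math.dist(centroid(clusters[ab[0]]), centroid(clusters[ab[1]])),
        )
        if math.dist(centroid(clusters[i]), centroid(clusters[j])) > merge_dist:
            break                       # no pair is close enough; stop merging
        clusters[i].extend(clusters.pop(j))
    return clusters

# two well-separated groups: the algorithm settles on 2 clusters by itself
pts = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (5.0, 5.0), (5.1, 5.0)]
print(len(agglomerate(pts)))  # → 2
```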
“…In order to remedy this effect, Murtagh defines an ultrametricity index, which overcomes the chaining effect, as the ratio between the number of triangles which are almost isosceles and the number of all triangles in the data [55]. Furthermore, a clustering based on ultrametric properties is described in [61]. The derived feature vectors serve as input for classification.…”
Section: Dimensionality Reduction (DR) vs. Feature Selection (FS)
confidence: 99%
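A triangle-based ultrametricity index in the spirit of the one attributed to Murtagh [55] can be sketched as follows: count the fraction of point triples whose two largest pairwise distances nearly coincide (almost-isosceles triangles with a small base, as required by the ultrametric inequality). The tolerance `tol` is an assumed parameter; the exact criterion in [55] may differ.

```python
import itertools
import math

def ultrametricity_index(points, tol=0.05):
    """Fraction of point triples forming almost-isosceles triangles with
    a small base: for each triple, sort the three pairwise distances and
    test whether the two largest agree within a relative tolerance."""
    total = 0
    almost_isosceles = 0
    for a, b, c in itertools.combinations(points, 3):
        d = sorted([math.dist(a, b), math.dist(b, c), math.dist(a, c)])
        total += 1
        # ultrametric triangle: the two largest sides coincide
        if d[2] - d[1] <= tol * d[2]:
            almost_isosceles += 1
    return almost_isosceles / total if total else 0.0

# an exactly ultrametric configuration (equilateral triangle) scores 1.0
eq = [(0.0, 0.0), (1.0, 0.0), (0.5, math.sqrt(3) / 2)]
print(ultrametricity_index(eq))  # → 1.0
```

An index near 1 suggests the data are close to ultrametric (hierarchically structured), while values near 0 indicate little hierarchical structure.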