2020
DOI: 10.1007/s11227-020-03429-1
Ensemble-based clustering of large probabilistic graphs using neighborhood and distance metric learning

Cited by 5 publications (3 citation statements)
References 38 publications
“…Ensemble clustering aims to produce clustering with higher quality and robustness than any of the many basic clustering algorithms by generating multiple cluster graphs of input graph and then combining their results efficiently (Ji et al, 2021). In this regard, the ENCND algorithm (Danesh et al, 2021) used the ensemble clustering method to partition the probabilistic graphs.…”
Section: Taxonomy of Probabilistic Graph Clustering
confidence: 99%
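The quoted statement describes the general ensemble-clustering recipe: generate several base clusterings and combine them. A minimal sketch of one common consensus strategy, evidence accumulation via a co-association matrix, is shown below. This is an illustrative stand-in, not the paper's ENCND algorithm; the toy point data stands in for whatever representation the base clusterers operate on.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# Toy data standing in for the objects being clustered.
X, _ = make_blobs(n_samples=60, centers=3, random_state=0)

# Step 1: generate multiple base clusterings with perturbed initializations.
n_runs = 10
labels_list = [
    KMeans(n_clusters=3, n_init=1, random_state=seed).fit_predict(X)
    for seed in range(n_runs)
]

# Step 2: co-association matrix -- the fraction of runs in which
# each pair of points lands in the same cluster.
n = X.shape[0]
coassoc = np.zeros((n, n))
for labels in labels_list:
    coassoc += (labels[:, None] == labels[None, :])
coassoc /= n_runs

# Step 3: consensus clustering by average-linkage on 1 - co-association.
dist = 1.0 - coassoc
np.fill_diagonal(dist, 0.0)
Z = linkage(squareform(dist, checks=False), method="average")
consensus = fcluster(Z, t=3, criterion="maxclust")
```

Averaging agreement across runs is what gives the ensemble its robustness: pairings that only one unstable base run produced are diluted, while consistently co-clustered pairs dominate the consensus.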
“…The larger the V_pc, the better the clustering effect, and the smaller the V_pe, the stronger the clustering effect. At the same time, the performance of the algorithm is evaluated by two clustering validity functions, the Davies-Bouldin index (DBI) and the Dunn index (DI) [30,31]. The quantitative evaluation results are listed in Tab.…”
Section: Matching and Fusing Feature Blocks
confidence: 99%
“…In real-world problems, diverse applications of clustering are reviewed, e.g., in Danesh, Dorrigiv & Yaghmaee (2020). The procedure is effectively used, e.g.…”
Section: Introduction
confidence: 99%