2020
DOI: 10.1007/s10851-020-00980-7

Power Spectral Clustering

Abstract: Spectral clustering is one of the most important image processing tools, especially for image segmentation. It specializes in taking local information, such as edge weights, and globalizing it. Due to its unsupervised nature, it is widely applicable. However, traditional spectral clustering is O(n^{3/2}). This poses a challenge, especially given the recent trend of large datasets. In this article, we propose an algorithm using ideas from Γ-convergence, which is an amalgamation of Maximum Spanning Tree (MST…
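The pipeline the abstract alludes to — local edge weights globalized through a Laplacian eigendecomposition — is the classical unnormalized spectral clustering recipe. A minimal NumPy sketch on a toy graph, assuming a two-way split via the sign of the Fiedler vector (the function name is illustrative, not the paper's implementation):

```python
import numpy as np

def spectral_bipartition(W):
    """Split a graph into two clusters using the Fiedler vector.

    W: symmetric (n, n) affinity matrix of edge weights.
    Returns one boolean label per node.
    """
    D = np.diag(W.sum(axis=1))   # degree matrix
    L = D - W                    # unnormalized graph Laplacian
    # Full eigendecomposition is the expensive step that motivates
    # faster approximations such as the one proposed in the paper.
    _, vecs = np.linalg.eigh(L)  # eigenvalues in ascending order
    fiedler = vecs[:, 1]         # eigenvector of second-smallest eigenvalue
    return fiedler > 0           # its sign gives the bipartition

# Two triangles joined by one weak edge: nodes 0-2 vs. nodes 3-5.
W = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5)]:
    W[i, j] = W[j, i] = 1.0
W[2, 3] = W[3, 2] = 0.1          # weak bridge between the two clusters
labels = spectral_bipartition(W)
```

The weak bridge carries little weight, so the Fiedler vector changes sign exactly across it, recovering the two triangles as clusters.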

Cited by 15 publications (20 citation statements)
References 27 publications (60 reference statements)
“…The complexity is arrived at by calculating the number of unit calculations done in the algorithm. The complexity of SC-TNF2 is similar to the computational complexities of the Powered Gaussian proposed by Nataliani et al, 2017 [3] and LPCA proposed by Castro et al, 2017 [4]. Besides, the proposed method uses sparse matrices for Laplacian matrix calculation, making it time and space efficient.…”
Section: Results and Analysis
confidence: 64%
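The statement above credits sparse matrices for the time and space efficiency of the Laplacian computation. A small SciPy sketch of that idea, assuming a k-nearest-neighbour-style affinity where each node touches only a few others (the example graph is illustrative):

```python
import numpy as np
import scipy.sparse as sp

# Sparse affinity: only a handful of edges per node, so a dense (n, n)
# array would waste O(n^2) memory on zeros.
n = 5
rows = np.array([0, 1, 1, 2, 3])
cols = np.array([1, 0, 2, 1, 4])
vals = np.array([1.0, 1.0, 0.5, 0.5, 2.0])
W = sp.csr_matrix((vals, (rows, cols)), shape=(n, n))
W = W.maximum(W.T)                       # symmetrize the affinity matrix

deg = np.asarray(W.sum(axis=1)).ravel()  # node degrees
D = sp.diags(deg)
L = (D - W).tocsr()                      # sparse unnormalized Laplacian
```

Only O(|E|) entries are ever stored, and every row of a graph Laplacian sums to zero — a quick sanity check on the construction.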
“…The proposed techniques are compared with the following methods: spectral clustering algorithm (NJW) by Ng et al [19] (2002), Neighbour Propagation (NP) (2012) proposed by Li and Guo [16], Shared Nearest Neighbours (SNN) (2016) proposed by Ye and Sakurai [31], Powered Gaussian (PG) (2017) by Nataliani et al [18], Spectral clustering using Local PCA (LPCA) [1] (2017), Powered Ratio Cut (PRCUT) [4] (2018). In the experiments conducted, three metrics were used for comparison: Adjusted Rand Index (ARI) [20], Normalized Mutual Information (NMI) [24] and Clustering Error (CE) [11].…”
Section: Results and Analysis
confidence: 99%
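Of the three metrics named above, the Adjusted Rand Index is the most self-contained to illustrate: it counts pairwise agreements between two clusterings, corrected for chance, and is invariant to how the clusters are labelled. A minimal sketch of the Hubert–Arabie formulation (the helper name is illustrative, not from the paper):

```python
import numpy as np
from math import comb

def adjusted_rand_index(labels_true, labels_pred):
    """ARI from the contingency table (Hubert & Arabie formulation)."""
    true_ids = np.unique(labels_true)
    pred_ids = np.unique(labels_pred)
    # contingency[i, j] = #points in true cluster i and predicted cluster j
    contingency = np.array([[np.sum((labels_true == t) & (labels_pred == p))
                             for p in pred_ids] for t in true_ids])
    sum_ij = sum(comb(int(nij), 2) for nij in contingency.ravel())
    sum_a = sum(comb(int(a), 2) for a in contingency.sum(axis=1))
    sum_b = sum(comb(int(b), 2) for b in contingency.sum(axis=0))
    n = len(labels_true)
    expected = sum_a * sum_b / comb(n, 2)     # chance-level agreement
    max_index = (sum_a + sum_b) / 2
    return (sum_ij - expected) / (max_index - expected)

# Perfect agreement up to relabelling scores 1.0.
truth = np.array([0, 0, 0, 1, 1, 1])
pred = np.array([1, 1, 1, 0, 0, 0])
```

Because only pair co-membership matters, swapping cluster labels 0 and 1 in `pred` does not change the score.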
“…The above theorem provides an initial step to explain the MST approximation in [31]. It has been shown earlier [12], [13], [19], [23], [24] that the limit of minimizers preserves the essential properties of solutions, thus giving useful results. Theorem 2 states that computing the limit of the minimizers of the relaxed seeded isoperimetric partitioning problem on the UMST is the same as on the original graph.…”
Section: B. Limit of Minimizers of Relaxed Seeded Isoperimetric Partitioning
confidence: 96%
“…The maximum spanning tree is instrumental in doing so. Recall that an MST of a graph G = (V, E, w) is a connected subgraph of G spanning V, with no cycles, such that the weight of the MST,

    weight(MST) = ∑_{e_ij ∈ MST} w_ij,   (12)

is maximized. The UMST is the weighted graph induced by the union of all the maximum spanning trees.…”
Section: Calculating the Limit of Minimizers
confidence: 99%
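The quoted definition, Eq. (12), maximizes total edge weight, so a maximum spanning tree is obtained by running Kruskal's algorithm with edges considered in decreasing weight order. A self-contained sketch with a union-find structure (the function name is illustrative, not the paper's code):

```python
def maximum_spanning_tree(n, edges):
    """Kruskal's algorithm, sorting edges by decreasing weight so the
    heaviest edges that do not form a cycle are kept.

    n: number of nodes, labelled 0..n-1.
    edges: list of (weight, u, v) tuples.
    Returns the MST edge list and its total weight.
    """
    parent = list(range(n))

    def find(x):                      # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst, total = [], 0.0
    for w, u, v in sorted(edges, reverse=True):
        ru, rv = find(u), find(v)
        if ru != rv:                  # edge joins two components: keep it
            parent[ru] = rv
            mst.append((u, v, w))
            total += w
    return mst, total

# 4-cycle with one diagonal; the two lightest edges close cycles
# and are dropped, leaving 3 edges of total weight 4 + 3 + 2 = 9.
edges = [(4.0, 0, 1), (3.0, 1, 2), (2.0, 2, 3), (1.0, 3, 0), (2.5, 0, 2)]
mst, total = maximum_spanning_tree(4, edges)
```

Running the same routine on negated weights would yield a minimum spanning tree, which is why many libraries expose only one of the two.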