2017
DOI: 10.1609/aaai.v31i1.10895

Optimal Neighborhood Kernel Clustering with Multiple Kernels

Abstract: Multiple kernel $k$-means (MKKM) aims to improve clustering performance by learning an optimal kernel, which is usually assumed to be a linear combination of a group of pre-specified base kernels. However, we observe that this assumption can: i) limit the representation capability of the learned kernel; and ii) insufficiently account for the negotiation between the process of learning the optimal kernel and that of clustering, leading to unsatisfactory clustering performance. To address these issues, we propose an optimal…
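To make the linear-combination assumption and the kernel/clustering "negotiation" concrete, here is a minimal sketch of MKKM-style alternating optimization. The squared-weight parameterization and closed-form weight update follow the common MKKM formulation (Huang, Chuang, and Chen 2011, cited below); the function names `combine_kernels` and `mkkm` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.linalg import eigh

def combine_kernels(kernels, mu):
    # K_mu = sum_p mu_p^2 * K_p: the linear-combination assumption
    # (squared simplex weights, as in the common MKKM formulation)
    return sum((w ** 2) * K for w, K in zip(mu, kernels))

def mkkm(kernels, k, n_iter=20):
    # Alternate between the spectral relaxation of kernel k-means (update H)
    # and a closed-form kernel-weight update (update mu). This alternation is
    # the "negotiation" between kernel learning and clustering.
    m, n = len(kernels), kernels[0].shape[0]
    mu = np.full(m, 1.0 / m)  # start from uniform weights
    for _ in range(n_iter):
        K = combine_kernels(kernels, mu)
        # top-k eigenvectors of K maximize tr(H^T K H) subject to H^T H = I
        _, H = eigh(K, subset_by_index=[n - k, n - 1])
        # per-kernel clustering cost tr(K_p (I - H H^T)) under the current partition
        cost = np.array([np.trace(Kp) - np.trace(H.T @ Kp @ H) for Kp in kernels])
        # minimizing sum_p mu_p^2 cost_p on the simplex gives mu_p proportional to 1/cost_p
        mu = 1.0 / np.maximum(cost, 1e-12)
        mu /= mu.sum()
    return mu, H
```

The relaxed partition H can then be discretized, e.g. by running standard k-means on its rows, to obtain cluster labels.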

Cited by 77 publications (33 citation statements: 0 supporting, 33 mentioning, 0 contrasting; citing years 2018–2024)
References 6 publications
“…For the post-processing of K*, we perform kernel k-means to obtain the clustering partition and labels, whose computational complexity is $O(n^3)$. Although the computational complexity of our LSWMKC algorithm is the same as that of the compared models [14]-[16], [19], [24], [36], [40], [48], [51], its clustering performance exhibits significant improvement, as reported in Table II.…”
Section: E. Analysis and Extensions (mentioning)
Confidence: 99%
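To make the quoted $O(n^3)$ figure concrete, the sketch below shows the standard post-processing step for a learned kernel K*: the dominant cost is the eigendecomposition of the $n \times n$ kernel matrix. The name `kernel_kmeans_labels` is a hypothetical helper, assuming a spectral-relaxation implementation of kernel k-means.

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.cluster import KMeans

def kernel_kmeans_labels(K_star, k):
    # Spectral relaxation of kernel k-means on a learned kernel K*:
    # the eigendecomposition below is the O(n^3) step cited above.
    n = K_star.shape[0]
    _, H = eigh(K_star, subset_by_index=[n - k, n - 1])  # top-k eigenvectors
    # row-normalize the relaxed partition, then discretize with plain k-means
    H /= np.maximum(np.linalg.norm(H, axis=1, keepdims=True), 1e-12)
    return KMeans(n_clusters=k, n_init=10).fit_predict(H)
```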
“…Moreover, to encode the emerging data generated from heterogeneous sources or views, multiple kernel clustering (MKC) provides a flexible and expansive framework for combining a set of kernel matrices, since different kernels naturally correspond to different views [12]-[18]. Multiple kernel k-means (MKKM) [19] and its variants have been further developed and widely employed in many applications [15], [16], [20]-[23]. Most kernel-based algorithms follow the common assumption that all samples are reliable for exploiting the intrinsic structures of the data, so this globally designed manner computes the pairwise similarities of all samples equally [15]-[17], [20], [21], [24], [25].…”
Section: Introduction (mentioning)
Confidence: 99%
“…Specifically, Avg-KKM (baseline) obtains the consensus kernel by uniformly combining the base kernels and then performs kernel k-means on it. We also select five classical algorithms: MKKM (Huang, Chuang, and Chen 2011), LMKKM (Gönen and Margolin 2014), MKKM-MR (Liu et al. 2016), LKAM (Li et al. 2016), and ONKC (Liu et al. 2017). Additionally, we choose four of the most recent methods, i.e.…”
Section: Experiments Settings (mentioning)
Confidence: 99%
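For reference, the Avg-KKM baseline described in this excerpt reduces to a few lines; this is a sketch under the same assumptions as above (`avg_kkm` is a hypothetical name, and `kernel_kmeans_labels` is the helper sketched earlier).

```python
def avg_kkm(kernels, k):
    # Uniform consensus kernel: every base kernel gets weight 1/m
    K_avg = sum(kernels) / len(kernels)
    # ...followed by kernel k-means on the consensus kernel
    return kernel_kmeans_labels(K_avg, k)
```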
“…Multiple kernel clustering (MKC) aims to extract complementary information from multiple pre-specified kernels and then categorize data with similar patterns or structures into the same cluster (Zhao, Kwok, and Zhang 2009; Kloft, Rückert, and Bartlett 2010; Kloft et al. 2011; Yu et al. 2011; Huang, Chuang, and Chen 2011; Gönen and Alpaydın 2011; Zhou et al. 2015; Han et al. 2016; Wang et al. 2017b; Zhou et al. 2021a; Liu et al. 2021a). Owing to its ability to mine inherent non-linear information, MKC has been intensively researched and commonly applied to various applications (Liu et al. 2016; Liu et al. 2017; Bhadra, Kaski, and Rousu 2017; Liu et al. 2019; Zhou et al. 2019, 2021b).…”
Section: Introduction (mentioning)
Confidence: 99%