2019
DOI: 10.1109/tbdata.2018.2797977

Kernel-Induced Label Propagation by Mapping for Semi-Supervised Classification

Abstract: Kernel methods have been successfully applied to the areas of pattern recognition and data mining. In this paper, we mainly discuss the issue of propagating labels in kernel space. A Kernel-Induced Label Propagation (Kernel-LP) framework by mapping is proposed for high-dimensional data classification using the most informative patterns of data in kernel space. The essence of Kernel-LP is to perform joint label propagation and adaptive weight learning in a transformed kernel space. That is, our Kernel-LP change…
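For orientation, the core operation the abstract describes, propagating labels over affinities computed in a kernel-induced space, can be sketched as below. This is a minimal, standard label-propagation loop over an RBF-kernel graph (in the style of Zhou et al.), not the paper's full Kernel-LP with joint adaptive weight learning; the function name and parameters are illustrative assumptions.

```python
import numpy as np

def kernel_label_propagation(X, y, sigma=1.0, alpha=0.99, n_iter=100):
    """Propagate labels over an RBF-kernel affinity graph.

    y holds class indices for labeled points and -1 for unlabeled ones.
    A generic label-propagation sketch, not the paper's Kernel-LP.
    """
    n = X.shape[0]
    # RBF (Gaussian) kernel affinities, i.e. similarities in the
    # kernel-induced feature space.
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-sq_dists / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    # Symmetric normalization: S = D^{-1/2} W D^{-1/2}.
    d_inv_sqrt = 1.0 / np.sqrt(W.sum(axis=1))
    S = W * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    # One-hot label matrix; unlabeled rows stay all-zero.
    n_classes = int(y.max()) + 1
    Y = np.zeros((n, n_classes))
    labeled = y >= 0
    Y[labeled, y[labeled]] = 1.0
    # Iterate F <- alpha * S F + (1 - alpha) * Y, the usual propagation update.
    F = Y.copy()
    for _ in range(n_iter):
        F = alpha * (S @ F) + (1.0 - alpha) * Y
    return F.argmax(axis=1)
```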

Cited by 33 publications (8 citation statements)
References 41 publications
“…Here, q = 1 → Q and we set Q = 3, as three different models are utilized to describe the set data. In order to aggregate such heterogeneous feature representations, three well-studied Riemannian kernel functions are then applied for high-dimensional feature embedding, in light of the proven success of kernel learning [43], [44], [45]. This process is implemented by mapping the original Riemannian manifold-valued features into a Hilbert space and then computing the dot product in it.…”
Section: B. Multi-Kernel Metric Learning for Heterogeneous Features
confidence: 99%
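As a concrete illustration of the mapping this passage describes, the sketch below computes a log-Euclidean RBF kernel between SPD matrices (e.g., covariance descriptors of image sets), one well-studied Riemannian kernel. The function name, the specific kernel choice, and the parameters are assumptions for illustration, not the cited paper's exact construction.

```python
import numpy as np
from scipy.linalg import logm

def log_euclidean_rbf_kernel(covs_a, covs_b, sigma=1.0):
    """Log-Euclidean RBF kernel between two lists of SPD matrices.

    Each SPD matrix is mapped into a (flat) Hilbert space via the
    matrix logarithm; the kernel is then a Gaussian on the distances
    between the flattened log-matrices.
    """
    logs_a = np.stack([logm(c).real.ravel() for c in covs_a])
    logs_b = np.stack([logm(c).real.ravel() for c in covs_b])
    sq_dists = ((logs_a[:, None, :] - logs_b[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))
```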
“…Typical examples are label propagation (LP) and label spreading (LS). In LP and LS, a kernel induced by mapping is used to classify high-dimensional data [14]. The kernels used are the radial basis function (RBF) and k-nearest-neighbor (KNN) kernels; the RBF kernel is also called the Gaussian kernel.…”
Section: Semi-Supervised Learning
confidence: 99%
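Both kernel choices mentioned here are available off the shelf: scikit-learn's LabelPropagation and LabelSpreading accept kernel="rbf" or kernel="knn". A minimal usage sketch, with toy data and hyperparameters chosen purely for illustration:

```python
import numpy as np
from sklearn.semi_supervised import LabelPropagation, LabelSpreading

# Toy 2-D data; -1 marks unlabeled samples (scikit-learn's convention).
X = np.array([[0.0, 0.0], [0.1, 0.1], [5.0, 5.0],
              [5.1, 4.9], [0.05, 0.0], [5.05, 5.0]])
y = np.array([0, -1, 1, -1, -1, -1])

lp_rbf = LabelPropagation(kernel="rbf", gamma=20).fit(X, y)
ls_knn = LabelSpreading(kernel="knn", n_neighbors=2).fit(X, y)

print(lp_rbf.transduction_)  # labels inferred for all points (RBF graph)
print(ls_knn.transduction_)  # labels inferred for all points (KNN graph)
```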
“…Based on spectral graph theory, connected points should be as close as possible in the latent common space. Following (Belkin and Niyogi 2003; Zhang et al. 2018c), a reasonable criterion for choosing the transformations is to minimize the following objective function:…”
Section: Objective Function
confidence: 99%
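The quoted excerpt cuts off before the formula itself; the criterion it refers to is, in the standard Laplacian-eigenmaps form of Belkin and Niyogi (2003), written as follows (notation assumed here, with z_i the embedding of point i and W the affinity matrix):

```latex
\min_{Z}\ \frac{1}{2}\sum_{i,j} \lVert z_i - z_j \rVert_2^2 \, W_{ij}
  \;=\; \min_{Z}\ \operatorname{tr}\!\bigl(Z^{\top} L Z\bigr),
\qquad L = D - W,\quad D_{ii} = \sum\nolimits_{j} W_{ij},
```

typically subject to a constraint such as \(Z^{\top} D Z = I\) to rule out trivial solutions.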
“…To use the label information, some supervised and semi-supervised methods have been proposed to preserve the discrimination in the latent common space (Kan et al. 2016; Zhang et al. 2018b). Although supervised cross-modal methods have achieved promising performance by utilizing the label information, they rely entirely on the labeled data and face two problems: 1) it is time- and cost-prohibitive to collect well-annotated data (Zhang et al. 2018c). Especially for multimedia data, this task is even more burdensome, since multiple modalities remarkably increase the labeling workload; 2) a large amount of unlabeled data is much easier to obtain, but it cannot be used by supervised approaches.…”
Section: Introduction
confidence: 99%