2013
DOI: 10.1016/j.patcog.2013.01.010

Local discriminative distance metrics ensemble learning

Abstract: The ultimate goal of distance metric learning is to incorporate abundant discriminative information to keep all data samples in the same class close and those from different classes separated. Local distance metric methods can preserve discriminative information by considering the neighborhood influence. In this paper, we propose a new local discriminative distance metrics (LDDM) algorithm to learn multiple distance metrics from each training sample (a focal sample) and in the vicinity of that focal sample (fo…
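The abstract describes the core idea only at a high level: one discriminative metric is learned per focal training sample from its local neighborhood, and the resulting metrics are combined into an ensemble. The sketch below illustrates that idea in Python; the k-nearest-neighbor neighborhoods, the LDA-style scatter-ratio metric, and the majority-vote prediction rule are illustrative assumptions, not the paper's exact LDDM algorithm, and all function names are hypothetical.

```python
import numpy as np

def local_metric(X_nbr, y_nbr, reg=1e-3):
    """LDA-style discriminative metric from one focal neighborhood:
    (regularized) inverse within-class scatter times between-class scatter,
    projected back onto the PSD cone so it defines a valid distance."""
    d = X_nbr.shape[1]
    mean_all = X_nbr.mean(axis=0)
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in np.unique(y_nbr):
        Xc = X_nbr[y_nbr == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean_all).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)
    M = np.linalg.solve(Sw + reg * np.eye(d), Sb)
    w, V = np.linalg.eigh((M + M.T) / 2)            # symmetrize, then...
    return V @ np.diag(np.clip(w, 0, None)) @ V.T   # ...clip eigenvalues to PSD

def fit_lddm(X, y, k=10):
    """One local metric per focal training sample, learned from the focal
    sample and its k nearest neighbors under the Euclidean distance."""
    metrics = []
    for i in range(len(X)):
        nbr = np.argsort(np.sum((X - X[i]) ** 2, axis=1))[: k + 1]
        metrics.append(local_metric(X[nbr], y[nbr]))
    return metrics

def predict(x, X, y, metrics, n_vote=5):
    """Ensemble prediction: the n_vote focal samples nearest to x each vote
    with the label of x's nearest training point under their own local metric.
    Assumes small non-negative integer class labels (for np.bincount)."""
    votes = []
    for i in np.argsort(np.sum((X - x) ** 2, axis=1))[:n_vote]:
        diffs = X - x
        dist = np.einsum("nd,de,ne->n", diffs, metrics[i], diffs)
        votes.append(y[np.argmin(dist)])
    return np.bincount(np.asarray(votes)).argmax()

# usage on toy data
rng = np.random.default_rng(0)
X = rng.normal(size=(80, 4)); y = np.repeat([0, 1], 40)
metrics = fit_lddm(X, y, k=10)
print(predict(X[0] + 0.1, X, y, metrics))
```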

Cited by 57 publications (16 citation statements)
References 31 publications
“…
Method       Set 1   Set 2
MCCA [23]    65.50   64.00
MMFA [23]    65.00   64.00
LDDM [20]    66.50   66.00
DMMA [18]    65.50   63.50
MNRML [19]   66.50   65.50
DMML [31]    74.50   70.00
GInCS        75.80   72.20
Table 3. Comparison between the GInCS and other popular methods on the UB KinFace dataset.…”
Section: Methods
confidence: 99%
“…The existing Mahalanobis metric learning methods can be classified into two categories [21]: global-based and local-based. Global-based methods try to keep samples of the same class close and samples of different classes apart by using only pairwise distance constraints.…”
Section: Mahalanobis Metric Learning
confidence: 99%
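The distinction quoted above — a single global Mahalanobis matrix fitted from pairwise same-class/different-class constraints versus metrics adapted to local neighborhoods — can be illustrated with a small sketch. The closed-form global metric used here (inverse pooled within-class covariance, in the spirit of RCA-style learners) is an illustrative assumption, not a specific method from the cited references.

```python
import numpy as np

def mahalanobis(x, z, M):
    """Distance parameterized by a PSD matrix M: sqrt((x - z)^T M (x - z))."""
    d = x - z
    return float(np.sqrt(d @ M @ d))

def global_metric_from_same_class_pairs(X, y, reg=1e-3):
    """A simple global metric: pooled within-class covariance, inverted,
    so that same-class ('must-link') pairs become close under the metric."""
    d = X.shape[1]
    Sw = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        Sw += (Xc - Xc.mean(axis=0)).T @ (Xc - Xc.mean(axis=0))
    Sw /= len(X)
    return np.linalg.inv(Sw + reg * np.eye(d))

# usage on toy data
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 3))
y = np.repeat([0, 1, 2], 20)
M = global_metric_from_same_class_pairs(X, y)
print(mahalanobis(X[0], X[1], M))    # same-class pair
print(mahalanobis(X[0], X[25], M))   # different-class pair
```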
“…Although they have been successfully applied to various applications, ranging from unsupervised segmentation of switching dynamics to face recognition and image fusion [12][13][14], a drawback of this kind of classifier is that their solutions do not take into account the local manifold structure and the potential discriminant information hidden in the data. To overcome this drawback of SVM and t-SVM, a modified version of SVM called the Laplacian Support Vector Machine (Lap-SVM) [15], based on the Manifold Regularization (MR) framework, was proposed by combining locality preserving projections (LPP) [16,17] with SVM. LPP not only preserves the local manifold structure of the dataset, but also yields a linear projection that can be applied to new test samples for dimensionality reduction.…”
Section: Introduction
confidence: 99%
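The LPP step referenced in the quote can be sketched concretely: build a k-nearest-neighbor graph with heat-kernel weights, form its Laplacian L and degree matrix D, and solve the generalized eigenproblem X^T L X a = lambda X^T D X a, keeping the eigenvectors with the smallest eigenvalues as a linear projection that preserves local manifold structure and extends to new test samples. The neighborhood size k, kernel width t, and regularizer below are illustrative assumptions, not values from the cited papers.

```python
import numpy as np
from scipy.linalg import eigh

def lpp(X, k=5, t=1.0, n_components=2, reg=1e-6):
    """Locality preserving projections: returns a (d, n_components) matrix P
    so that X @ P embeds the data while preserving local neighborhoods."""
    n = len(X)
    d2 = np.sum((X[:, None] - X[None, :]) ** 2, axis=2)   # pairwise squared distances
    W = np.zeros((n, n))
    for i in range(n):
        nbr = np.argsort(d2[i])[1 : k + 1]                 # k nearest neighbors (skip self)
        W[i, nbr] = np.exp(-d2[i, nbr] / t)                # heat-kernel weights
    W = np.maximum(W, W.T)                                 # symmetrize the graph
    D = np.diag(W.sum(axis=1))
    L = D - W                                              # graph Laplacian
    A = X.T @ L @ X
    B = X.T @ D @ X + reg * np.eye(X.shape[1])
    vals, vecs = eigh(A, B)                                # generalized eigenproblem
    return vecs[:, :n_components]                          # smallest eigenvalues first

# usage: the projection is linear, so it applies to new test samples directly
X = np.random.default_rng(1).normal(size=(100, 10))
P = lpp(X)
Z = X @ P          # embedded training data
z_new = X[0] @ P   # out-of-sample projection of a (here, reused) sample
```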