2012 IEEE/WIC/ACM International Conferences on Web Intelligence and Intelligent Agent Technology
DOI: 10.1109/wi-iat.2012.46
Local Tangent Distances for Classification Problems

Cited by 1 publication (2 citation statements); references 13 publications.
“…When the data is assumed to lie on a low-dimensional manifold, local tangent hyperplanes are a simple and intuitive approach to enhancing the data set and gaining insight into the manifold structure. Our proposed method is very much related to tangent distance classification (TDC) [14,15,16], which constructs local tangent hyperplanes of the class manifolds, computes the distances between these hyperplanes and the given test sample, and then classifies the test sample to the class with the closest hyperplane. We show in Section 5 that our proposed method's integration of tangent hyperplane basis vectors into the sparse representation framework generally outperforms TDC.…”
Section: Related Work
confidence: 99%
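The tangent distance classification (TDC) procedure described above — fit a local tangent hyperplane to each class manifold near the test sample, then assign the sample to the class whose hyperplane is closest — can be sketched as follows. This is an illustrative implementation, not the authors' code: the function name `tangent_distance_classify` and the parameters `n_neighbors` and `tangent_dim` are assumptions, and the tangent basis is estimated by local PCA (an SVD of the centered nearest-neighbor set within each class).

```python
import numpy as np

def tangent_distance_classify(x, class_data, n_neighbors=5, tangent_dim=1):
    """Sketch of tangent distance classification (TDC).

    x          : (D,) test sample.
    class_data : dict mapping class label -> (N_c, D) array of training samples.
    Returns the label of the class whose local tangent hyperplane is
    closest to x.
    """
    best_label, best_dist = None, np.inf
    for label, X in class_data.items():
        # Nearest neighbors of x within this class.
        d = np.linalg.norm(X - x, axis=1)
        nbrs = X[np.argsort(d)[:n_neighbors]]
        mu = nbrs.mean(axis=0)
        # Local PCA: top right-singular vectors of the centered neighborhood
        # span the estimated tangent hyperplane at mu.
        _, _, Vt = np.linalg.svd(nbrs - mu, full_matrices=False)
        B = Vt[:tangent_dim].T            # (D, tangent_dim), orthonormal columns
        # Distance from x to the affine hyperplane mu + span(B):
        # norm of the residual after projecting (x - mu) onto the basis.
        r = x - mu
        dist = np.linalg.norm(r - B @ (B.T @ r))
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label
```

For instance, with two 1-D class manifolds embedded in the plane (points along y = 0 versus y = 5), a test point near the first line is assigned to that class because its residual distance to the first tangent line is small.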
“…We compared LPCA-SRC to the original SRC, SRC pruned (a modification of SRC which we explain shortly), two versions of tangent distance classification (our implementations are inspired by Yang et al [16]), locality-sensitive dictionary learning SRC [20], k-nearest neighbors classification, and k-nearest neighbors classification over an extended dictionary.…”
Section: Algorithms Compared
confidence: 99%