2007
DOI: 10.1016/j.patcog.2006.05.033
Kernel clustering-based discriminant analysis

Cited by 34 publications (14 citation statements)
References 4 publications
“…These techniques can be extended to nonlinear methods by means of kernelization [19,2]. Another principled way to extend dimensionality reducing data visualization to auxiliary information is offered by an adaptation of the underlying metric.…”
Section: Introduction
confidence: 99%
“…The main idea is to first map the data from the initial space to a high-dimensional Hilbert space, where they might be linearly separable, and then use a linear subspace method. This approach yields the kernelized versions of the linear techniques that have already been developed, e.g., Kernel Principal Component Analysis (KPCA) [23], Kernel Discriminant Analysis (KDA) [24], Kernel Clustering Discriminant Analysis (KCDA) [25], Kernel Subclass Discriminant Analysis (KSDA) [26], etc.…”
Section: Related Work
confidence: 99%
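The kernel-trick recipe quoted above (map the data into a high-dimensional feature space, then apply a linear subspace method) can be illustrated with a minimal sketch of kernel PCA, one of the kernelized methods the statement lists. This is an illustrative implementation assuming an RBF kernel, not the formulation of any of the cited papers; the function names are hypothetical.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_pca(X, n_components=2, gamma=1.0):
    """Project training points onto the top principal axes
    in the (implicit) RBF feature space."""
    K = rbf_kernel(X, gamma)
    n = K.shape[0]
    # Center the kernel matrix, i.e. center the data in feature space
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one
    # eigh returns eigenvalues in ascending order; take the largest ones
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # For a unit eigenvector a_k of Kc with eigenvalue l_k, the
    # feature-space coordinate of point i is sqrt(l_k) * a_k[i]
    return vecs * np.sqrt(np.maximum(vals, 1e-12))
```

A supervised variant such as KDA or KCDA replaces the eigenproblem on the centered Gram matrix with a (sub)class-separability criterion, but the mapping step is the same.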
“…A variety of discriminative DR techniques has been proposed, such as Fisher's linear discriminant analysis (LDA), partial least squares regression (PLS), informed projections [7], global linear transformations of the metric [14,4], or kernelizations of such approaches [25,2]. A general idea, which we will use in our approach, is to locally modify the metric [28,12] by defining a Riemannian manifold which takes auxiliary information of the data into account and which measures the effect of data dimensions in the feature space on this auxiliary information.…”
Section: Discriminative Dimensionality Reduction Based On the Fisher
confidence: 99%
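The locally modified metric described in the last statement is commonly realized as the Fisher information metric of the class-conditional distribution, J(x) = Σ_c p(c|x) ∇ log p(c|x) ∇ log p(c|x)ᵀ, which measures how much each data dimension affects the auxiliary (label) information at x. The sketch below estimates the gradients by central differences and uses a hypothetical softmax toy model for p(c|x); both the model and the names are illustrative assumptions, not the cited authors' construction.

```python
import numpy as np

def fisher_metric(x, cond_prob, eps=1e-5):
    """Local Fisher information metric
    J(x) = sum_c p(c|x) * g_c g_c^T, where g_c = grad_x log p(c|x)
    is estimated by central finite differences."""
    d = x.shape[0]
    p = cond_prob(x)
    n_classes = p.shape[0]
    G = np.zeros((n_classes, d))
    for i in range(d):
        e = np.zeros(d)
        e[i] = eps
        G[:, i] = (np.log(cond_prob(x + e)) - np.log(cond_prob(x - e))) / (2 * eps)
    return sum(p[c] * np.outer(G[c], G[c]) for c in range(n_classes))

# Hypothetical toy conditional model: softmax over linear class scores
W = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])
def cond_prob(x):
    s = W @ x
    e = np.exp(s - s.max())  # numerically stable softmax
    return e / e.sum()
```

Directions in which p(c|x) changes rapidly get large weight under J(x), so distances measured with this metric emphasize exactly the feature-space dimensions that carry label information, which is the point the quoted passage makes.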