Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence 2017
DOI: 10.24963/ijcai.2017/306

Locality Adaptive Discriminant Analysis

Abstract: Linear Discriminant Analysis (LDA) is a popular technique for supervised dimensionality reduction, and its performance is satisfactory on Gaussian-distributed data. However, because LDA neglects local data structure, it is inapplicable to many real-world situations. Some works therefore focus on discriminant analysis between neighboring points, but such approaches are easily affected by noise in the original data space. In this paper, we propose a new supervised dimensionality reduction method, Locality Adaptive …
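Classical LDA, which the abstract takes as its starting point, finds projection directions by solving a generalized eigenproblem on the between-class and within-class scatter matrices. The NumPy sketch below is a minimal illustration of that standard LDA formulation only, not the paper's LADA method; the function name `lda` and the use of a pseudo-inverse of the within-class scatter are choices of this example.

```python
import numpy as np

def lda(X, y, n_components):
    """Project X onto n_components classical LDA directions (illustrative sketch)."""
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean_all).reshape(-1, 1)
        Sb += len(Xc) * diff @ diff.T
    # Solve Sb w = lambda Sw w via pinv(Sw) @ Sb (pseudo-inverse for robustness)
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(eigvals.real)[::-1]  # keep directions with largest eigenvalues
    W = eigvecs[:, order[:n_components]].real
    return X @ W
```

As the abstract notes, this global formulation uses only class means and total scatter, so it ignores local neighborhood structure; that limitation is what motivates locality-aware variants.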

Cited by 137 publications (43 citation statements). References 18 publications.
“…Examples of the mapping function include unsupervised dimensionality reductions, such as principal component analysis (PCA) (Pearson 1901;Jolliffe 1986); locality preserving projections (LPP) (He and Niyogi 2004); and supervised dimensionality reductions, such as Fisher discriminant analysis (FDA) (Fisher 1936), local FDA (LFDA) (Sugiyama 2007), semi-supervised LFDA (SELF) (Sugiyama et al 2010), locality adaptive discriminant analysis (LADA) (Li et al 2017); and complex moment-based supervised eigenmap (CMSE) (Imakura et al 2019). One can also consider a partial structure of deep neural networks.…”
Section: Fundamental Concept and Framework
confidence: 99%
“…We compare the proposed method (SFS) with previous supervised dimensionality reduction methods in terms of accuracy by numerical experiments on artificial data and practical datasets and show that the proposed method outperforms previous methods in some cases, and is more robust than the previous methods. The previous methods were kernel versions of LDA (KDA) [2,6] and LFDA (KLFDA) [25] and LADA [17]. The performance measures used for comparisons were overall accuracy (OA), average accuracy (AA), and normalized mutual information (NMI) [%] [23].…”
Section: Numerical Experiments
confidence: 99%
“…A family of discriminant analysis methods are proposed for dimensionality reduction, including Fisher discriminant analysis (FDA) (Fisher 1936;Fukunaga 2013), local FDA (LFDA) (Sugiyama 2007), semi-supervised LFDA (SELF) (Sugiyama et al 2010) and locality adaptive discriminant analysis (LADA) (Li et al 2017).…”
Section: Introduction
confidence: 99%