2017
DOI: 10.1016/j.patcog.2016.08.024
Regularized coplanar discriminant analysis for dimensionality reduction

Abstract: The dimensionality reduction methods based on linear embedding, such as neighborhood preserving embedding (NPE), sparsity preserving projections (SPP) and collaborative representation based projections (CRP), try to preserve a certain kind of linear representation for each sample after projection. However, in the transformed low-dimensional space, the linear relationship between the samples may be changed, which cannot make the linear representation-based classifiers, such as sparse representation-based class…
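As a rough illustration of the linear-representation-preserving idea shared by NPE, SPP and CRP (not the paper's RCDA implementation), the sketch below reconstructs each sample from its k nearest neighbors and then seeks a projection under which those reconstruction weights remain valid. The function name, the regularization term and the parameter defaults are all assumptions.

```python
# Minimal NPE-style sketch (hypothetical helper, not the paper's RCDA code):
# each sample is reconstructed from its k nearest neighbors, and the
# projection is chosen so those reconstruction weights stay valid after
# mapping to the low-dimensional space.
import numpy as np
from scipy.linalg import eigh

def npe_like_projection(X, k=5, d=2, reg=1e-3):
    """X: (n_samples, n_features); returns an (n_features, d) projection."""
    n = X.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        # k nearest neighbors of x_i (excluding itself)
        dist = np.linalg.norm(X - X[i], axis=1)
        nbrs = np.argsort(dist)[1:k + 1]
        # solve for reconstruction weights: x_i ~ sum_j w_ij x_j
        Z = X[nbrs] - X[i]                      # centered neighbors
        G = Z @ Z.T + reg * np.eye(k)           # regularized local Gram matrix
        w = np.linalg.solve(G, np.ones(k))
        W[i, nbrs] = w / w.sum()                # weights sum to one
    M = (np.eye(n) - W).T @ (np.eye(n) - W)
    # generalized eigenproblem X^T M X a = lambda X^T X a, smallest eigenvalues
    A = X.T @ M @ X
    B = X.T @ X + reg * np.eye(X.shape[1])
    vals, vecs = eigh(A, B)
    return vecs[:, :d]
```

The projection is built to respect the reconstruction weights W; the abstract's point is that, even so, the linear relationships actually used by representation-based classifiers may not survive in the reduced space.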

Cited by 34 publications (10 citation statements) | References 52 publications
“…where α is a compromise parameter. W_ij is the spatial-spectral matrix in Equation (14) and G_ij is still the weight matrix representing the nearest neighbor relationship. If x_i and x_j are neighbors,…”
Section: ISS-WME Model (mentioning)
confidence: 99%
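The citing paper's Equation (14) is not reproduced in this fragment; purely as a hedged sketch of how a compromise parameter α might blend a precomputed spatial-spectral affinity W with a nearest-neighbor weight matrix G, one could write the following (the function name, the 0/1 neighbor weighting and the defaults are assumptions, not the ISS-WME model itself).

```python
# Hypothetical blend of a spatial-spectral affinity W with a k-nearest-neighbor
# graph G via a compromise parameter alpha; not the citing paper's Equation (14).
import numpy as np

def blended_affinity(W, X, alpha=0.5, k=5):
    """W: (n, n) precomputed spatial-spectral affinity; X: (n, n_features)."""
    n = X.shape[0]
    G = np.zeros((n, n))
    for i in range(n):
        dist = np.linalg.norm(X - X[i], axis=1)
        for j in np.argsort(dist)[1:k + 1]:     # k nearest neighbors of x_i
            G[i, j] = G[j, i] = 1.0             # symmetric 0/1 neighbor weights
    return alpha * W + (1.0 - alpha) * G
```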
“…The purpose of the MDR method in HSI is to find the manifold structure in the high-dimensional space. LLE [14] obtains the reconstruction weight by characterizing the local adjacency sample of the data and keeps the neighborhood relationship in the local range unchanged when mapping to the low-dimensional space. However, an LLE algorithm only determines the neighbor relationship between points and cannot describe the structural features of data.…”
Section: Introduction (mentioning)
confidence: 99%
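To make the LLE behaviour described above concrete, a standard scikit-learn call is shown below; the data array and parameter values are placeholders, not taken from the cited work.

```python
# Standard LLE via scikit-learn: reconstruction weights are learned from each
# sample's local neighborhood and preserved in the low-dimensional embedding.
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding

X = np.random.rand(200, 30)                     # placeholder data, e.g. HSI pixels
lle = LocallyLinearEmbedding(n_neighbors=10, n_components=2)
X_low = lle.fit_transform(X)                    # (200, 2) embedding
```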
“…As described in the aforementioned analysis, SPP ignores the label information of samples and cannot well exhibit the local neighborhood relationship of the adjacent graph. Although a few improved algorithms [39][40][41][42] have been proposed, their performance is still limited. In this paper, we propose a DSGE which consists of two improvements, on adjacent graph construction and low-dimensional projection, respectively.…”
Section: Discriminative Sparsity Graph Embedding (mentioning)
confidence: 99%
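For context on the SPP limitation noted above, the sketch below builds an SPP-style adjacency graph by sparsely coding each sample over the remaining samples with a Lasso solver; it ignores labels by construction. The function name and penalty value are assumptions, and this is not the citing paper's DSGE code.

```python
# SPP-style graph construction sketch: each sample is sparsely coded over the
# other samples, and the coefficients become graph weights (labels unused).
import numpy as np
from sklearn.linear_model import Lasso

def sparse_graph(X, alpha=0.01):
    """X: (n_samples, n_features); returns an (n, n) sparse-coefficient graph."""
    n = X.shape[0]
    S = np.zeros((n, n))
    for i in range(n):
        others = np.delete(np.arange(n), i)
        lasso = Lasso(alpha=alpha, max_iter=5000)
        lasso.fit(X[others].T, X[i])            # x_i ~ D s, D = other samples as columns
        S[i, others] = lasso.coef_
    return S
```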
“…The above supervised learning methods [39][40][41][42] utilize the local neighborhood information of intra-class samples and inter-class samples respectively, but they ignore the global distribution information of all samples in space. In fact, researchers have shown that the global geometric structure of data sets implies useful discriminative information which is important for image identification [43,44].…”
Section: Introduction (mentioning)
confidence: 99%
“…Supervised algorithms conduct datasets with labels that aim to present better performance and low complexity. Linear discriminant analysis (LDA), local discriminant embedding (LDE) [7], discriminant sparse neighborhood preserving embedding (DSNPE) [8], regularized coplanar discriminant analysis (RCDA) [9], marginal Fisher analysis (MFA) [3, 5, 10], discriminant neighborhood embedding (DNE) [11], locality-based discriminant neighborhood embedding (LDNE) [12], and double adjacency graphs-based discriminant neighborhood embedding (DAG-DNE) [13] are typical supervised algorithms.…”
Section: Introduction (mentioning)
confidence: 99%
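Of the supervised methods listed above, LDA is the most widely available; a minimal scikit-learn example is given below with placeholder data (illustrative only, unrelated to any specific cited implementation).

```python
# Supervised dimensionality reduction with LDA: the projection uses class labels,
# unlike the unsupervised linear-embedding methods discussed earlier.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X = np.random.rand(150, 20)                     # placeholder features
y = np.random.randint(0, 3, size=150)           # placeholder labels, 3 classes
lda = LinearDiscriminantAnalysis(n_components=2)  # at most n_classes - 1 components
X_low = lda.fit_transform(X, y)                 # supervised 2-D projection
```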