2011
DOI: 10.1016/j.patrec.2011.01.012
Manifold-respecting discriminant nonnegative matrix factorization

Cited by 32 publications (13 citation statements)
References 12 publications
“…NMF on a manifold emerges when the data lie on a nonlinear low-dimensional submanifold (Cai et al., 2008). Manifold-regularized discriminative NMF methods (An et al., 2011; Guan et al., 2011) were proposed with special constraints that preserve local invariance, so as to reflect the manifold characteristics of the data.…”
Section: Related Work
confidence: 99%
“…To integrate manifold learning into the NMF framework, a graph-regularized NMF (GNMF) algorithm was first proposed in [6]; several GNMF variants were subsequently introduced in [35–38].…”
Section: Related Work
confidence: 99%
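The GNMF idea mentioned above augments the NMF objective with a graph-Laplacian penalty, min ||X − UVᵀ||² + λ Tr(Vᵀ L V) with L = D − W. A minimal NumPy sketch of the standard multiplicative updates for this objective (illustrative only; it follows Cai et al.'s general formulation, not the exact algorithm of the paper under discussion, and the function name `gnmf` is ours):

```python
import numpy as np

def gnmf(X, W, k, lam=1.0, n_iter=200, seed=0):
    """Graph-regularized NMF: min ||X - U V^T||_F^2 + lam * Tr(V^T L V),
    where L = D - W is the Laplacian of the affinity matrix W (n x n).
    X is m x n (samples in columns); returns nonnegative U (m x k), V (n x k)."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    U = rng.random((m, k))
    V = rng.random((n, k))
    D = np.diag(W.sum(axis=1))          # degree matrix of the graph
    eps = 1e-9                          # avoid division by zero
    for _ in range(n_iter):
        # multiplicative updates keep U, V nonnegative throughout
        U *= (X @ V) / (U @ (V.T @ V) + eps)
        V *= (X.T @ U + lam * (W @ V)) / (V @ (U.T @ U) + lam * (D @ V) + eps)
    return U, V
```

The Laplacian term pulls the low-dimensional representations of graph-connected samples toward each other, which is how local invariance on the manifold is preserved.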
“…We denote l_i as the label of x_i. Recent studies in spectral graph theory and manifold learning have demonstrated that the local geometric structure can be effectively modelled by a nearest-neighbour graph over the data points. To exploit both the geometric structure of the data and the label information, An et al. [12] constructed the adjacency matrix W of an intra-class K-NN graph using binary weights that indicate neighbourhood relationships.…”
Section: Standard NMF
confidence: 99%
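The intra-class K-NN graph described above can be sketched as follows: an edge gets binary weight 1 only when one sample is among the K nearest neighbours of another *and* the two share a class label. This is an assumed reading of the construction in [12] (the function name and symmetrisation choice are ours):

```python
import numpy as np

def intraclass_knn_graph(X, labels, k=3):
    """Binary adjacency matrix of an intra-class k-NN graph:
    W[i, j] = 1 iff x_j is among the k nearest neighbours of x_i
    and labels[i] == labels[j]. X is n x d (samples in rows)."""
    n = X.shape[0]
    # pairwise Euclidean distances; exclude self-matches
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    W = np.zeros((n, n))
    for i in range(n):
        for j in np.argsort(d[i])[:k]:          # k nearest neighbours of x_i
            if labels[j] == labels[i]:          # keep only same-class edges
                W[i, j] = W[j, i] = 1.0         # symmetrise the graph
    return W
```

Restricting edges to same-class neighbours is what makes the resulting regularizer discriminative: samples from different classes are never pulled together.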
“…Moreover, the value of K was varied from 1 to 10. Because we want to compare the best average results of the NMF-K-NN algorithm and our proposed algorithm, we chose the regularisation parameter α and the feature dimensionality as in [12]: α ∈ {0.01, 0.1, 1, 10, 100} and the dimensionality in {200, 195, 190, …, 10, 5}, respectively. Since X can be approximated column-wise by UVᵀ, we can project a sample x_i from the original high-dimensional space to the low-dimensional space as y = U†x_i, wherein the projection matrix U† = (UᵀU)⁻¹Uᵀ is the pseudo-inverse of U.…”
Section: Face Recognition
confidence: 99%
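The projection y = U†x with U† = (UᵀU)⁻¹Uᵀ is the least-squares solution of x ≈ Uy, so an unseen sample can be embedded without re-running the factorization. A small sketch (the helper name `project` is ours; in practice `np.linalg.pinv` or `lstsq` is more numerically robust than forming (UᵀU)⁻¹ explicitly):

```python
import numpy as np

def project(U, x):
    """Project a sample x onto the span of the NMF basis U via the
    pseudo-inverse: y = U^+ x, with U^+ = (U^T U)^{-1} U^T.
    Requires U to have full column rank."""
    U_pinv = np.linalg.inv(U.T @ U) @ U.T   # Moore-Penrose pseudo-inverse
    return U_pinv @ x
```

Note that y may contain negative entries, since the pseudo-inverse projection does not enforce the nonnegativity constraint of the factorization itself.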