2016 International Joint Conference on Neural Networks (IJCNN) 2016
DOI: 10.1109/ijcnn.2016.7727616
Manifold locality constrained low-rank representation and its applications

Cited by 2 publications (2 citation statements)
References 19 publications
“…where ∥Z∥₁ is the ℓ1-norm, which guarantees the sparsity of the coefficient matrix. In real-world applications, the sparse and nonnegative matrix Z obtained by the NNLRS method can serve as a basis for semi-supervised learning by constructing a discriminative and informative graph (You et al., 2016).…”
Section: Non-negative LRR with Sparsity (mentioning)
confidence: 99%
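The NNLRS objective quoted above combines a nuclear-norm term with an ℓ1 term under a nonnegativity constraint, min ‖Z‖_* + λ‖Z‖₁ s.t. X = XZ, Z ≥ 0. A minimal sketch of the two proximal maps that typical solvers for such objectives alternate between — singular value thresholding for the nuclear norm and nonnegative soft-thresholding for the ℓ1 term. The function names `svt` and `nonneg_soft` are our own illustration, not from the cited paper:

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: the prox of tau * ||.||_*.
    Shrinks each singular value by tau, zeroing the small ones."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def nonneg_soft(M, tau):
    """Prox of tau * ||.||_1 restricted to the nonnegative orthant:
    subtract tau elementwise, then clip negatives to zero."""
    return np.maximum(M - tau, 0.0)

rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))

low_rank = svt(M, 1.0)           # singular values shrunk toward zero
sparse_nn = nonneg_soft(M, 0.5)  # sparse and elementwise nonnegative
```

In a full NNLRS-style solver these two maps would be applied to split variables inside an ADMM loop, with the constraint X = XZ enforced via a multiplier update.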
“…The LRR method can find a low-rank matrix that captures and represents the global structure of the raw dataset (Liu et al., 2010). The key insight of LRR is that high-dimensional data can be represented by underlying low-dimensional subspaces (You et al., 2016). In bioinformatics, LRR has achieved great success in gene expression data mining.…”
Section: Introduction (mentioning)
confidence: 99%
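The claim above — that LRR recovers a low-rank matrix exposing the union of low-dimensional subspaces — can be seen in the noiseless case, where min ‖Z‖_* s.t. X = XZ has the closed-form solution Z* = VVᵀ from the skinny SVD X = USVᵀ (Liu et al., 2010). A toy sketch, with hypothetical data drawn from two independent 2-D subspaces:

```python
import numpy as np

rng = np.random.default_rng(0)

def subspace_samples(n, dim=10, rank=2):
    """Draw n points from a random rank-dimensional subspace of R^dim
    (hypothetical toy data, not from the cited papers)."""
    basis = np.linalg.qr(rng.standard_normal((dim, rank)))[0]
    return basis @ rng.standard_normal((rank, n))

# 40 points: columns 0-19 from one subspace, 20-39 from another.
X = np.hstack([subspace_samples(20), subspace_samples(20)])

# Noiseless LRR closed form: Z* = V_r V_r^T from the skinny SVD of X.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
r = int(np.sum(s > 1e-8))        # effective rank (here 2 + 2 = 4)
Z = Vt[:r].T @ Vt[:r]            # 40 x 40 representation matrix

# For independent subspaces Z is block-diagonal: entries linking
# points from different subspaces vanish, revealing the segmentation.
cross = np.abs(Z[:20, 20:]).max()
```

The block-diagonal pattern of Z is what spectral clustering then exploits to segment the data into its subspaces.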