2015
DOI: 10.1109/tip.2015.2441632

Constructing a Nonnegative Low-Rank and Sparse Graph With Data-Adaptive Features

Abstract: This paper aims at constructing a good graph to discover the intrinsic data structures under a semisupervised learning setting. First, we propose to build a nonnegative low-rank and sparse (referred to as NNLRS) graph for the given data representation. In particular, the weights of edges in the graph are obtained by seeking a nonnegative low-rank and sparse reconstruction coefficients matrix that represents each data sample as a linear combination of others. The so-obtained NNLRS-graph captures both the global…
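The abstract above describes edge weights obtained by reconstructing each sample as a nonnegative combination of the other samples. As a rough illustration only, the sketch below builds a symmetric weight matrix from nonnegative reconstruction coefficients using plain nonnegative least squares (SciPy's `nnls`); it enforces only nonnegativity, not the paper's joint low-rank and sparse penalties, and the function name `nnls_graph` is ours, not from the paper.

```python
import numpy as np
from scipy.optimize import nnls

def nnls_graph(X):
    """Build graph weights by representing each sample as a nonnegative
    linear combination of the remaining samples.

    Simplified stand-in for an NNLRS-style construction: nonnegative
    least squares only, without the low-rank + sparse penalties.
    X: (d, n) data matrix, one sample per column.
    """
    d, n = X.shape
    Z = np.zeros((n, n))
    for i in range(n):
        others = np.delete(X, i, axis=1)      # exclude sample i itself
        z, _ = nnls(others, X[:, i])          # nonnegative coefficients
        Z[np.arange(n) != i, i] = z
    # symmetrize to obtain undirected edge weights
    return (Z + Z.T) / 2.0
```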

Cited by 104 publications (46 citation statements)
References 31 publications
“…Zhang et al [198] proposed a discriminative tensor sparse coding (RTSC) method for robust image classification. Recently, low-rank-based sparse representation has become a popular topic; one example is the non-negative low-rank and sparse graph [199]. Some sparse representation methods for face recognition can be found in a review [83], and more image classification methods can be found in a more recent review [200].…”
Section: Algorithm 17: The Scheme of Sparse Representation Based Classification
confidence: 99%
“…As in [40], [41], [42], the image samples used in this paper are normalized to unit length (i.e., the ℓ2-norm of each image vector equals one). As the accuracy of our method is not sensitive to either parameter, we fix one to 0.1 and the other to 1 for all the experiments in this paper.…”
Section: Results
confidence: 99%
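The normalization described in the excerpt (each image vector scaled to unit ℓ2-norm) can be sketched as follows; `normalize_columns` is an illustrative helper name, not from the cited paper.

```python
import numpy as np

def normalize_columns(X, eps=1e-12):
    """Scale each column (one image vector) to unit l2-norm.

    The eps guard leaves all-zero columns at zero instead of
    dividing by zero.
    """
    norms = np.linalg.norm(X, axis=0, keepdims=True)
    return X / np.maximum(norms, eps)
```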
“…Similar to [39], [40], [41], [52], the proposed objective function (5) of LRD²L is nonconvex with respect to the unknown variables. Since the proposed LRD²L algorithm can only converge to a local minimum, the initialization of each dictionary is important for reaching a desirable solution.…”
Section: Complete Algorithm of LRD²L
confidence: 99%
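The excerpt's point, that a nonconvex objective makes a local solver's answer depend on its starting point, can be seen on a toy one-dimensional function (not the LRD²L objective itself):

```python
import numpy as np
from scipy.optimize import minimize

# Toy nonconvex function with two local minima; which one a local
# solver reaches depends entirely on the initialization -- the same
# reason dictionary initialization matters for nonconvex objectives.
f = lambda x: (x[0] ** 2 - 1.0) ** 2 + 0.3 * x[0]

left = minimize(f, x0=[-1.0]).x[0]   # converges near x = -1
right = minimize(f, x0=[+1.0]).x[0]  # converges near x = +1
```

Starting on the right lands in the shallower basin: `f(left) < f(right)`, so only one initialization finds the better minimum.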
“…He et al [28] proposed a nonnegative sparse algorithm to derive the graph weights for graph-based SSL. Beyond the sparsity property, Zhuang et al [29], [30] also imposed low-rank constraints to estimate the weight matrix of the pairwise-relationship graph for SSL. The main difference between these works and our proposed method S³RC is that they used sparse/low-rank techniques to learn the weight matrix for graph-based SSL, making them essentially SSL methods.…”
Section: Related Work
confidence: 99%
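The excerpt contrasts methods that learn a graph weight matrix and then run graph-based SSL on it. A minimal sketch of the standard label-propagation step such a learned matrix would feed into (the classic normalized propagation scheme, not the S³RC method itself; `propagate_labels` is our illustrative name):

```python
import numpy as np

def propagate_labels(W, Y, alpha=0.99, iters=100):
    """Graph-based label propagation on a weight matrix W.

    W: (n, n) symmetric nonnegative graph weights (e.g. from an
       NNLRS-style construction).
    Y: (n, c) one-hot labels, with all-zero rows for unlabeled samples.
    Returns soft label scores F; predict with F.argmax(axis=1).
    """
    d = W.sum(axis=1)
    d[d == 0] = 1.0                     # avoid division by zero on isolated nodes
    S = W / np.sqrt(np.outer(d, d))     # symmetric normalization D^-1/2 W D^-1/2
    F = Y.astype(float).copy()
    for _ in range(iters):
        F = alpha * S @ F + (1 - alpha) * Y   # spread labels, anchor to seeds
    return F
```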