CVPR 2011
DOI: 10.1109/cvpr.2011.5995484
Image classification by non-negative sparse coding, low-rank and sparse decomposition

Cited by 157 publications (70 citation statements)
References 22 publications
“…This process is time-consuming and might increase the redundancy in each sub-dictionary, thus not guaranteeing consistency of sparse codes for signals from the same class. [31] presents an image classification framework by using non-negative sparse coding, low-rank and sparse matrix decomposition. A linear SVM classifier is used for the final classification.…”
Section: Related Work (mentioning)
confidence: 99%
“…Low-rank matrix recovery, which determines a low-rank data matrix from corrupted input data, has been successfully applied to applications including salient object detection [24], segmentation and grouping [35,13,6], background subtraction [7], tracking [34], and 3D visual recovery [13,31]. However, there is limited work [5,19] using this technique for multi-class classification.…”
Section: Introduction (mentioning)
confidence: 99%
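The low-rank recovery described in the statement above is conventionally posed as robust PCA (principal component pursuit): decompose an observed matrix X into a low-rank part L plus a sparse corruption S. A minimal NumPy sketch using a fixed-penalty augmented-Lagrangian (ADMM-style) solver; the parameter defaults (λ = 1/√max(m, n) and the choice of μ) are common conventions from the RPCA literature, not values taken from the cited papers:

```python
import numpy as np

def soft_threshold(M, tau):
    """Entrywise shrinkage: proximal operator of the l1 norm."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def sv_threshold(M, tau):
    """Singular-value shrinkage: proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * soft_threshold(s, tau)) @ Vt

def rpca(X, lam=None, mu=None, n_iter=1000, tol=1e-7):
    """Split X into low-rank L and sparse S via principal component pursuit:
    min ||L||_* + lam * ||S||_1  s.t.  L + S = X."""
    m, n = X.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    mu = mu if mu is not None else (m * n) / (4.0 * np.abs(X).sum())
    S = np.zeros_like(X)
    Y = np.zeros_like(X)          # dual variable for the constraint L + S = X
    norm_X = np.linalg.norm(X)
    for _ in range(n_iter):
        L = sv_threshold(X - S + Y / mu, 1.0 / mu)   # update low-rank part
        S = soft_threshold(X - L + Y / mu, lam / mu)  # update sparse part
        Y += mu * (X - L - S)                         # dual ascent step
        if np.linalg.norm(X - L - S) < tol * norm_X:
            break
    return L, S
```

On data that satisfies the usual incoherence and sparsity conditions, this recovers both components to high accuracy; with grossly corrupted images it is the decomposition step that precedes classification.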
“…These methods include local binary pattern [5], spectral-texture feature [7], census transform-structure feature [1], traditional sparse coding (TSC) [8], the non-negative sparse coding (NNSC) [12], Laplacian sparse coding (LSC) [13], and K-SVD sparse coding [14]. For all the coding models, we use SVM as the classifier.…”
Section: Results (mentioning)
confidence: 99%
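The non-negative sparse coding step shared by several of the coding models above amounts to a non-negative, ℓ1-regularized least-squares problem per descriptor. A small NumPy sketch using projected ISTA; this solver and its parameters are an illustrative choice, not the exact algorithm of the cited papers:

```python
import numpy as np

def nn_sparse_code(X, D, lam=0.01, n_iter=3000):
    """Non-negative sparse coding of the columns of X over dictionary D:
    minimize 0.5 * ||X - D @ U||_F^2 + lam * sum(U)  subject to  U >= 0,
    via projected ISTA (gradient step, then shift-and-clip to the
    non-negative orthant, which is the proximal step for this penalty)."""
    step = 1.0 / (np.linalg.norm(D, 2) ** 2)  # 1 / Lipschitz constant of the gradient
    U = np.zeros((D.shape[1], X.shape[1]))
    for _ in range(n_iter):
        grad = D.T @ (D @ U - X)              # gradient of the quadratic term
        U = np.maximum(U - step * (grad + lam), 0.0)
    return U
```

The resulting non-negative codes U (typically max-pooled over spatial regions) are what is fed to the SVM classifier mentioned in the quote.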
“…The core idea is to reconstruct a feature with codewords by solving a least-squares-based optimization problem with constraints on the codewords and reconstruction coefficients [Huang et al. 2013; Cong et al. 2011, 2013; Shao and Tan 2014]. While there are many variations in the recent literature, such as Laplacian sparse coding [Gao et al. 2010], mixture sparse coding [Yang et al. 2010], multilayer group sparse coding [Gao et al. 2011], and non-negative sparse coding [Zhang et al. 2011], the unified representation of sparse coding can be generally written as Equation (1). Given feature descriptors X, the conventional way to solve Equation (1) is to optimize iteratively, alternating over U or V while fixing the other, as in ScSPM (Spatial Pyramid Matching based on sparse coding) [Yang et al. 2009]. Another kind of method generates the codebook V by clustering (e.g., k-means) in advance and does not change it when obtaining the reconstruction coefficients, such as BOW.…”
Section: Review of Sparse Coding (mentioning)
confidence: 99%
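The equation the quote refers to as Equation (1) did not survive extraction. The unified objective it conventionally denotes (following the ScSPM-style formulation; the notation below is assumed rather than taken from the source) is:

```latex
\min_{U,\,V}\; \sum_{i=1}^{N} \bigl\lVert x_i - u_i V \bigr\rVert_2^2
  \;+\; \lambda\, \Omega(u_i)
\qquad \text{s.t.}\;\; \lVert v_k \rVert \le 1,\;\; \forall k
```

Here X = [x_1, …, x_N] are the feature descriptors, V is the codebook with codewords v_k, U collects the reconstruction coefficients u_i, and Ω is a sparsity-inducing penalty (the ℓ1 norm for standard sparse coding); the variants listed in the quote change Ω or add constraints such as u_i ≥ 0 for non-negative sparse coding.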
“…The Vector of Locally Aggregated Descriptors (VLAD) [Jegou et al. 2010, 2012] can be regarded as a simplified version of Fisher vector coding that does not store second-order information about the features. In addition, sparse coding with different constraints on the codewords has also been widely applied for image classification [Yang et al. 2009; Zhang et al. 2011].…”
Section: Introduction (mentioning)
confidence: 99%