2014
DOI: 10.1109/tip.2014.2324290

Multiple Kernel Learning for Sparse Representation-Based Classification

Abstract: In this paper, we propose a multiple kernel learning (MKL) algorithm based on the sparse representation-based classification (SRC) method. Taking advantage of the ability of the nonlinear kernel SRC to efficiently represent nonlinearities in the high-dimensional feature space, we propose an MKL method based on the kernel alignment criterion. Our method uses a two-step training procedure to learn the kernel weights and the sparse codes. At each iteration, the sparse codes are updated first while fixing the kernel mixin…
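As a rough illustration of the kernel-alignment criterion the abstract refers to, the sketch below picks mixing weights for a set of base kernels by their Frobenius alignment with the ideal label kernel yyᵀ. This is a simplified one-shot weighting of our own, not the paper's alternating two-step optimization of weights and sparse codes; all names and parameter choices here are illustrative assumptions.

```python
import numpy as np

def alignment(K, K_ideal):
    # Frobenius inner-product alignment between two kernel matrices
    num = np.sum(K * K_ideal)
    den = np.linalg.norm(K) * np.linalg.norm(K_ideal)
    return num / den

# toy two-class data in 2-D
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(3, 1, (20, 2))])
y = np.array([1.0] * 20 + [-1.0] * 20)

# three base kernels: linear, degree-2 polynomial, RBF
K_lin = X @ X.T
K_poly = (1.0 + X @ X.T) ** 2
sq = np.sum(X**2, axis=1)
K_rbf = np.exp(-(sq[:, None] + sq[None, :] - 2.0 * X @ X.T) / 2.0)

# ideal (target) kernel built from the labels
K_ideal = np.outer(y, y)
alignments = np.array([alignment(K, K_ideal) for K in (K_lin, K_poly, K_rbf)])

# weight each base kernel by its (non-negative) alignment, then normalize
weights = np.clip(alignments, 0.0, None)
weights /= weights.sum()
K_combined = sum(w * K for w, K in zip(weights, (K_lin, K_poly, K_rbf)))
```

The combined kernel can then be plugged into any kernel SRC solver; the paper's method instead re-estimates the weights after each sparse-coding step.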

Cited by 89 publications (43 citation statements)
References 39 publications
“…As a result, we used a polynomial kernel of degree 4 in our experiments. Several methods have been proposed in the literature for optimizing the choice of kernel and kernel parameters, such as cross-validation and multiple kernel learning (Shrivastava et al. 2014a). However, these methods tend to make the optimization problem very complex and time-consuming.…”
Section: Results
confidence: 99%
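The cross-validation alternative mentioned in this excerpt can be sketched with a plain k-fold search over the polynomial degree. The classifier below (kernel ridge regression with the sign taken as the label) and all parameter values are illustrative assumptions, not the cited paper's setup.

```python
import numpy as np

def poly_kernel(A, B, degree):
    # inhomogeneous polynomial kernel (1 + <a, b>)^degree
    return (1.0 + A @ B.T) ** degree

def krr_predict(K_train, y_train, K_test_train, ridge=1e-2):
    # kernel ridge regression used as a simple binary classifier
    alpha = np.linalg.solve(K_train + ridge * np.eye(len(y_train)), y_train)
    return np.sign(K_test_train @ alpha)

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.5, (30, 2)), rng.normal(1, 0.5, (30, 2))])
y = np.array([-1.0] * 30 + [1.0] * 30)

# 5-fold cross-validation over candidate polynomial degrees
folds = np.array_split(rng.permutation(len(y)), 5)
scores = {}
for degree in (1, 2, 3, 4, 5):
    accs = []
    for i in range(5):
        test_idx = folds[i]
        train_idx = np.concatenate([folds[j] for j in range(5) if j != i])
        K_tr = poly_kernel(X[train_idx], X[train_idx], degree)
        K_te = poly_kernel(X[test_idx], X[train_idx], degree)
        pred = krr_predict(K_tr, y[train_idx], K_te)
        accs.append(np.mean(pred == y[test_idx]))
    scores[degree] = float(np.mean(accs))
best_degree = max(scores, key=scores.get)
```

As the excerpt notes, this grid search multiplies the training cost by the number of folds times the number of candidate degrees, which is why MKL-style joint learning is attractive.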
“…There is a very rich literature on image classification, including methods based on bag of words [1,2], sparse representation [3][4][5][6][7], and deep learning [8][9][10]. We should point out that nonlinear classifiers, including kernel-based ones, have gained more attention due to their high performance compared to linear classifiers [5,7,9].…”
Section: Introduction
confidence: 99%
“…However, in many practical applications, linear representations are not able to represent the non-linear structures of data. To address this issue, many efforts have been devoted to developing kernel sparse representation classification (KSRC) [23]–[27]. Differing from those methods that find sparse representation coefficients in the original space, these kernel-based methods first map the original data into a high-dimensional feature space, and then learn the sparse representation in the obtained kernel space.…”
confidence: 99%
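The KSRC idea described in this last excerpt never forms the high-dimensional features explicitly: the sparse code is found using only kernel evaluations, since the reconstruction error ‖φ(x) − Φ(D)a‖² expands to K(x,x) − 2kₓᵀa + aᵀK_DD a. A minimal ISTA-style sketch of this kernelized sparse coding (toy dictionary, RBF kernel, and all parameter values are our own assumptions):

```python
import numpy as np

def rbf(A, B, gamma=0.5):
    # RBF kernel matrix between the rows of A and the rows of B
    d2 = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * d2)

def ksrc_sparse_code(K_DD, k_xD, lam=0.05, n_iter=200):
    """ISTA for kernel sparse coding:
    min_a  a^T K_DD a - 2 k_xD^T a + lam * ||a||_1
    (the kernelized reconstruction error, up to the constant K(x, x))."""
    a = np.zeros(K_DD.shape[0])
    L = 2.0 * np.linalg.eigvalsh(K_DD).max()  # Lipschitz constant of the gradient
    step = 1.0 / L
    for _ in range(n_iter):
        grad = 2.0 * (K_DD @ a - k_xD)
        z = a - step * grad
        a = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold
    return a

# toy dictionary of 10 atoms; the test sample sits next to atom 2
rng = np.random.default_rng(1)
D = rng.normal(size=(10, 3))
x = D[2] + 0.01 * rng.normal(size=3)

K_DD = rbf(D, D)                  # dictionary-dictionary kernel
k_xD = rbf(D, x[None, :])[:, 0]   # sample-dictionary kernel values
a = ksrc_sparse_code(K_DD, k_xD)
```

Note that `D` never appears inside the solver; only `K_DD` and `k_xD` do, which is exactly what lets these methods work in the implicit feature space.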