2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp.2011.5946732
A kernelized maximal-figure-of-merit learning approach based on subspace distance minimization

Cited by: 1 publication (1 citation statement)
References: 10 publications
“…There are mainly two branches of methods for dealing with this problem. One is to use a nonlinear discriminant function, such as a quadratic function [11]; the other is to apply kernel functions that map features into a high-dimensional Hilbert space [12]. Using a nonlinear discriminant function makes the objective hard to optimize, while kernelization approaches must cope with the increased dimensionality when there are hundreds of thousands of training samples.…”
Section: Introduction
Confidence: 99%
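The quoted passage contrasts two ways of obtaining a nonlinear decision boundary: an explicit nonlinear (e.g. quadratic) discriminant function versus an implicit kernel mapping into a high-dimensional Hilbert space. The sketch below is illustrative only and is not taken from the cited paper; the function names, the RBF kernel choice, and the gamma parameter are assumptions made for the example. It shows why the kernel branch runs into scale problems: the Gram matrix is n-by-n, so its size grows quadratically with the number of training samples.

```python
# Illustrative sketch (not from the paper): the two branches mentioned in the
# quoted passage -- an explicit quadratic feature expansion versus an implicit
# kernel mapping realized through an n-by-n Gram matrix.
import numpy as np

def quadratic_features(X):
    """Explicit quadratic expansion: [x, all pairwise products x_i * x_j].
    Feature dimension grows with the input dimension, not the sample count."""
    n, d = X.shape
    cross = np.einsum("ni,nj->nij", X, X).reshape(n, d * d)
    return np.hstack([X, cross])

def rbf_gram(X, gamma=0.5):
    """RBF kernel Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2).
    The matrix is n x n, so memory and computation grow quadratically with
    the number of training samples -- the scaling issue the passage notes."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))      # 200 samples, 5 features
    Phi = quadratic_features(X)        # explicit features: shape (200, 30)
    K = rbf_gram(X)                    # implicit mapping:  shape (200, 200)
    print(Phi.shape, K.shape)
```

With 200 samples the Gram matrix is modest, but at hundreds of thousands of samples it becomes impractical to store or invert, which is the trade-off the citing authors point out against the difficulty of optimizing an explicitly nonlinear objective.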