2015
DOI: 10.1117/12.2197924

Differential evolution algorithm-based kernel parameter selection for Fukunaga-Koontz Transform subspaces construction

Cited by 6 publications (4 citation statements)
References 7 publications
“…For classification problems, a parameter selection technique for RBF in KFKT is presented by Binol et al [22]. The independent variable of RBF, σ, is not optimized in this study.…”
Section: Data Sets and Parameter Settings (mentioning)
Confidence: 98%
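The statement above concerns tuning the RBF width σ for kernel FKT. As a minimal sketch of the general idea of differential-evolution-based kernel parameter selection, the snippet below tunes σ of an RBF kernel by minimizing a cross-validation error. This is not the authors' code: the objective uses a generic RBF-kernel classifier as a stand-in, and the actual KFKT subspace construction, data, and search bounds are assumptions for illustration only.

```python
# Hedged sketch: differential evolution over the RBF width sigma.
# The fitness function (CV error of a generic RBF classifier) is a placeholder,
# not the KFKT-based objective of the cited paper.
from scipy.optimize import differential_evolution
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=10, random_state=0)

def objective(params):
    sigma = params[0]
    gamma = 1.0 / (2.0 * sigma ** 2)   # RBF: k(x, z) = exp(-||x - z||^2 / (2 sigma^2))
    clf = SVC(kernel="rbf", gamma=gamma)
    # Differential evolution minimizes, so return the negative mean CV accuracy.
    return -cross_val_score(clf, X, y, cv=3).mean()

result = differential_evolution(objective, bounds=[(0.1, 10.0)], seed=0, maxiter=20)
print("selected sigma:", result.x[0])
```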
“…Namely, selecting the optimal kernel and its parameters is crucial for KPCA to achieve good performance. However, application results show that no single kernel function is best for all kinds of machine learning problems [22]; therefore, learning an optimal kernel over a set of candidate kernels is an active research area [23][24][25][26][27]. Li and Yang presented an ensemble KPCA method with a Bayesian inference strategy in [28].…”
Section: Introduction (mentioning)
Confidence: 99%
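The point that no single kernel suits every problem can be illustrated with a short, generic example. The sketch below simply runs KPCA with a few different kernels on toy data; it is an assumption-laden illustration using scikit-learn, not the ensemble or Bayesian methods cited in the statement above, and the kernel parameters are arbitrary.

```python
# Hedged sketch: KPCA with different kernels on toy data, showing that the
# kernel (and its parameters) must be chosen per problem.
from sklearn.datasets import make_circles
from sklearn.decomposition import KernelPCA

X, _ = make_circles(n_samples=300, factor=0.3, noise=0.05, random_state=0)

for kernel, params in [("linear", {}), ("poly", {"degree": 3}), ("rbf", {"gamma": 10.0})]:
    kpca = KernelPCA(n_components=2, kernel=kernel, **params)
    X_proj = kpca.fit_transform(X)          # projection differs markedly by kernel
    print(kernel, X_proj.shape)
```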
“…At present, studies in kernel function theory concentrate mainly on three aspects: the properties of the kernel function [7]; the construction (or improvement) of the kernel function [8,9]; and the selection of kernel parameters [10]. There are many kinds of kernel functions; the commonly used ones are the linear kernel, the polynomial kernel, the radial basis function (RBF) kernel, and the sigmoid kernel. In 2003, Keerthi et al [11] studied the RBF kernel and its two associated parameters (the penalty parameter C and the kernel width g), and analyzed the performance of the SVM classifier as these parameters took different values; the results showed that when the RBF kernel model is used, there is no need to consider the linear support vector machine. Lin et al [12] showed that the sigmoid kernel and the RBF kernel have similar performance for certain parameter values.…”
Section: Introduction (mentioning)
Confidence: 99%
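For the two RBF-SVM parameters mentioned in that statement (the penalty C and the kernel width, exposed as gamma in scikit-learn), a plain cross-validated grid search is the most common baseline. The sketch below is a generic illustration under assumed parameter ranges and synthetic data, not the procedure of Keerthi et al.

```python
# Hedged sketch: tuning the RBF-SVM penalty C and kernel width gamma by
# grid search with cross-validation; ranges are illustrative only.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=10, random_state=0)

param_grid = {"C": [0.1, 1, 10, 100], "gamma": [1e-3, 1e-2, 1e-1, 1]}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=3)
search.fit(X, y)
print("best parameters:", search.best_params_)
```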