2006
DOI: 10.1007/11840930_43

An Evolutionary Approach to Automatic Kernel Construction

Abstract: Kernel-based learning presents a unified approach to machine learning problems such as classification and regression. The selection of a kernel and associated parameters is a critical step in the application of any kernel-based method to a problem. This paper presents a data-driven evolutionary approach for constructing kernels, named KTree. An application of KTree to the Support Vector Machine (SVM) classifier is described. Experiments on a synthetic dataset are used to determine the best evolutiona…
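The truncated abstract does not specify KTree's construction procedure, but the general idea of evolving kernels as expression trees can be sketched: sums and products of valid kernels are themselves valid kernels, so random trees over {+, *} and base kernels stay admissible, and candidates are scored by classification accuracy. A minimal stdlib-only illustration follows; the base-kernel set, the absence of crossover, and the 1-NN stand-in for the SVM fitness are all assumptions, not the paper's actual operators:

```python
import math
import random

# Base kernels (assumed primitives; KTree's actual set may differ).
def rbf(gamma):
    return lambda x, y: math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def poly(degree):
    return lambda x, y: (1.0 + sum(a * b for a, b in zip(x, y))) ** degree

# Sums and products of valid kernels are valid kernels, so a random
# expression tree over {+, *} and base kernels is itself a valid kernel.
def random_kernel(rng, depth=2):
    if depth == 0:
        return rng.choice([rbf(rng.choice([0.1, 1.0])), poly(rng.choice([2, 3]))])
    left, right = random_kernel(rng, depth - 1), random_kernel(rng, depth - 1)
    if rng.random() < 0.5:
        return lambda x, y: left(x, y) + right(x, y)
    return lambda x, y: left(x, y) * right(x, y)

# Fitness: leave-one-out accuracy of a 1-NN classifier in the kernel-induced
# metric d(x,y)^2 = k(x,x) - 2 k(x,y) + k(y,y); a cheap stand-in for
# training an SVM per candidate.
def fitness(k, X, y):
    correct = 0
    for i, xi in enumerate(X):
        dists = [(k(xi, xi) - 2 * k(xi, xj) + k(xj, xj), y[j])
                 for j, xj in enumerate(X) if j != i]
        correct += min(dists)[1] == y[i]
    return correct / len(X)

def evolve(X, y, pop_size=8, gens=4, seed=0):
    rng = random.Random(seed)
    pop = [random_kernel(rng) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda k: fitness(k, X, y), reverse=True)
        # Elitist refill with fresh random trees (no crossover in this sketch).
        pop = pop[: pop_size // 2] + [random_kernel(rng) for _ in range(pop_size // 2)]
    return max(pop, key=lambda k: fitness(k, X, y))

# Two well-separated clusters: most evolved kernels separate them easily.
X = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (1.0, 1.0), (0.9, 1.0), (1.0, 0.9)]
y = [0, 0, 0, 1, 1, 1]
best = evolve(X, y)
```

On real data the per-candidate fitness would be a cross-validated SVM run, which is why fitness evaluation dominates the cost of such searches.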

Cited by 18 publications (23 citation statements)
References 8 publications
“…Another point to note here is that in OSVMs, the kernels that have been used mostly are Linear, Polynomial, Gaussian or Sigmoidal. We suggest it would be fruitful to investigate some more innovative forms of kernel, for example Genetic Kernels [47], that have shown greater potential in standard SVM classification. In the case where abundant unlabeled examples and some positive examples are available, researchers have used many two-step algorithms (as have been discussed in Section 3.3).…”
Section: Discussion
confidence: 98%
“…The macro level is a GA that evolves the kernel shape and parameters for a multiple kernel function. The steady-state genetic algorithm (SSGA) [7][12] is used as the underlying mechanism for the GA implementation. The micro level algorithm is an RVR algorithm used for computing the fitness of each GA individual from the macro level.…”
Section: B. Genetic Evolution Methods Of Complex Multiple Kernel Function
confidence: 99%
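The steady-state scheme referenced above differs from a generational GA in that each iteration produces a single offspring that replaces the current worst individual, so the best solution is never lost. A minimal stdlib-only sketch of that replacement mechanism (the quadratic toy fitness stands in for the RVR run the citing paper uses; operators and rates are assumptions):

```python
import random

def ssga(fitness, dim, pop_size=20, steps=200, seed=1):
    """Steady-state GA: one offspring per step, worst-individual replacement."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(steps):
        a, b = rng.sample(pop, 2)                     # parent selection
        child = [(x + y) / 2 for x, y in zip(a, b)]   # arithmetic crossover
        child[rng.randrange(dim)] += rng.gauss(0, 0.1)  # point mutation
        worst = min(range(pop_size), key=lambda j: fitness(pop[j]))
        if fitness(child) > fitness(pop[worst]):
            pop[worst] = child                        # steady-state replacement
    return max(pop, key=fitness)

# Toy fitness: maximize -sum(x^2); the optimum is the origin.
best = ssga(lambda v: -sum(x * x for x in v), dim=3)
```

Because only the worst individual is ever replaced, fitness of the incumbent best is monotone non-decreasing, which suits expensive fitness evaluations like a full RVR training run.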
“…However, these studies selected the multiple kernel (MK) by experience and did not address how to design a suitable MK automatically. Several recent studies [7][8][9] have proposed optimization methods that use a genetic algorithm to optimize the SVM kernels and parameters, but these methods inherit the drawbacks of the non-probabilistic outputs and Mercer's kernel limits of SVM. This paper uses similar optimization methods to solve the model selection of relevance vector regression.…”
Section: Introduction
confidence: 99%
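The Mercer limitation mentioned in the statement above can be checked empirically: a candidate kernel is admissible only if every Gram matrix it produces is positive semidefinite. A cheap test on a sample of points is to attempt a Cholesky factorization of K + eps·I, which succeeds exactly when the jittered matrix is positive definite. This stdlib-only sketch is an illustration, not a procedure from the cited papers:

```python
import math

def gram(k, X):
    """Gram matrix of kernel k over sample points X."""
    return [[k(x, y) for y in X] for x in X]

def is_psd(K, eps=1e-9):
    """Cholesky-based PSD test; eps jitter tolerates zero eigenvalues."""
    n = len(K)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = K[i][j] - sum(L[i][m] * L[j][m] for m in range(j))
            if i == j:
                s += eps
                if s <= 0:
                    return False          # negative pivot => not PSD
                L[i][i] = math.sqrt(s)
            else:
                L[i][j] = s / L[j][j]
    return True

X = [(0.0,), (0.5,), (1.0,)]
rbf = lambda x, y: math.exp(-(x[0] - y[0]) ** 2)   # valid Mercer kernel
bad = lambda x, y: -abs(x[0] - y[0])               # indefinite similarity
```

Evolutionary searches that close over sums and products of known-valid kernels avoid this check by construction; it is only needed when arbitrary expressions (e.g. differences or unconstrained functions) are allowed into the search space.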
“…While some standard kernels proposed in the literature are straightforwardly used in several applications, tailored kernels produce much better results as each problem has specific characteristics [42,43]. In order to achieve an automated machine learning approach, several works in the literature pose the kernel selection as a search problem in the space of kernels with no human intervention [25,12,28].…”
Section: Introduction
confidence: 99%
“…Once the space of possible kernels has been defined, the next relevant question is the selection of a strategy to carry out the search. Most of the works in the literature have proposed various heuristic algorithms to solve this search problem, Genetic Programming (GP) being one of the most used methods [25,12,28]. However, there is a lack of knowledge about many aspects related to the specific characteristics of the kernel function optimization problem, and in particular, about how these characteristics relate to the way the GP search for optimal solutions is accomplished.…”
Section: Introduction
confidence: 99%