2008
DOI: 10.1214/009053607000000677

Kernel methods in machine learning

Abstract: We review machine learning methods employing positive definite kernels. These methods formulate learning and estimation problems in a reproducing kernel Hilbert space (RKHS) of functions defined on the data domain, expanded in terms of a kernel. Working in linear spaces of functions has the benefit of facilitating the construction and analysis of learning algorithms while at the same time allowing large classes of functions. The latter include nonlinear functions as well as functions defined on nonvectorial data. …
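To make the kernel-expansion idea concrete, here is a minimal sketch of kernel ridge regression, where the learned function lives in the RKHS of a Gaussian kernel and is expanded over the training points as f(x) = Σᵢ αᵢ k(xᵢ, x). The kernel choice, toy data, and regularization strength are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (not from the paper): kernel ridge regression, where the learned
# function lives in the RKHS of a Gaussian kernel and is expanded over the data,
# f(x) = sum_i alpha_i * k(x_i, x).
import numpy as np

def gaussian_kernel(A, B, gamma=1.0):
    """Pairwise Gaussian kernel matrix k(a, b) = exp(-gamma * ||a - b||^2)."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))              # training inputs (assumed toy data)
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)   # noisy targets

lam = 1e-2                                        # regularization strength (assumed)
K = gaussian_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)  # expansion coefficients

X_test = np.linspace(-3, 3, 5).reshape(-1, 1)
f_test = gaussian_kernel(X_test, X) @ alpha       # f(x) = sum_i alpha_i k(x_i, x)
print(f_test)
```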


Cited by 1,803 publications (1,257 citation statements)
References 110 publications
Year Published: 2009–2024
“…It can be shown that even in the infinite dimensional case, this norm can be evaluated using the kernel. For more details see (Jäkel et al., 2007; Hofmann et al., 2008; Schölkopf & Smola, 2002).…”
Section: Box 2 Kernel Methods and Exemplar Models (mentioning)
confidence: 99%
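A small numerical sketch of the quoted point, assuming a Gaussian kernel (whose feature space is infinite-dimensional): for f = Σᵢ αᵢ k(xᵢ, ·), the squared RKHS norm is αᵀKα, so it can be evaluated from kernel values alone. The data and kernel width below are made up for illustration.

```python
# Sketch of the point quoted above: for f = sum_i alpha_i k(x_i, .),
# the squared RKHS norm is ||f||^2 = alpha^T K alpha, computable from kernel
# evaluations even though the Gaussian kernel's feature space is
# infinite-dimensional. Toy data and kernel width are assumptions.
import numpy as np

def gaussian_kernel(A, B, gamma=0.5):
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 3))        # expansion points x_i
alpha = rng.normal(size=20)         # expansion coefficients

K = gaussian_kernel(X, X)
rkhs_norm_sq = alpha @ K @ alpha    # ||f||_H^2 evaluated via the kernel
print(rkhs_norm_sq)
```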
“…Here, we will focus on a different set of techniques from machine learning, called kernel methods (Jäkel, Schölkopf, & Wichmann, 2007; Hofmann, Schölkopf, & Smola, 2008; Schölkopf & Smola, 2002). Contrary to multilayer neural networks, kernel methods are linear methods, in a way we will describe in more detail below.…”
Section: Learning In Humans and Machines (mentioning)
confidence: 99%
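One way to illustrate the "linear methods" point (our own toy construction, not taken from the cited works): ridge regression on the explicit quadratic feature map φ(x) = vec(xxᵀ) and kernel ridge regression with the matching kernel k(x, z) = (x·z)² produce identical predictions, because the kernel machine is the same linear model written in feature-space form.

```python
# Sketch: a kernel method is a linear model in feature space. Ridge regression on
# explicit degree-2 features phi(x) = vec(x x^T) matches kernel ridge regression
# with k(x, z) = (x . z)^2. Data, kernel, and lambda are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(30, 4))
y = rng.normal(size=30)
lam = 0.1

# Explicit feature map for the homogeneous quadratic kernel.
Phi = np.einsum('ni,nj->nij', X, X).reshape(len(X), -1)

# Linear ridge regression in the explicit feature space.
d = Phi.shape[1]
w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(d), Phi.T @ y)
pred_linear = Phi @ w

# Kernel ridge regression with the matching kernel k(x, z) = (x . z)^2.
K = (X @ X.T) ** 2
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
pred_kernel = K @ alpha

print(np.allclose(pred_linear, pred_kernel))   # True: same linear model, two forms
```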
“…However, evaluating and choosing a kernel function is often done experimentally. There exist different types of kernel functions that can be used in SVMs [25,6,28].…”
Section: Construction (mentioning)
confidence: 99%
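A brief sketch of how one might compare kernels empirically, as the quote suggests; the use of scikit-learn, the synthetic dataset, and the hyperparameters are all assumptions for illustration rather than choices made in the cited work.

```python
# Sketch: comparing several common SVM kernels by cross-validation. The dataset,
# library (scikit-learn), and hyperparameters are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=10, random_state=0)

candidates = {
    "linear": SVC(kernel="linear"),
    "poly (degree 3)": SVC(kernel="poly", degree=3),
    "rbf": SVC(kernel="rbf", gamma="scale"),
}

for name, clf in candidates.items():
    scores = cross_val_score(clf, X, y, cv=5)   # pick the kernel empirically
    print(f"{name:>16s}: mean accuracy {scores.mean():.3f}")
```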
“…Kernel density estimation (KDE) is an established non-parametric approach that is widely used in pattern analysis, computer vision, dimensionality reduction, and clustering [68,69,70,71]. In the KDE literature, methods that tackle the overall computational complexity primarily approach it from the perspective of reducing pairwise kernel evaluations; examples include Nystrom approximation [72], Fast Gauss Transform [73,74] and sparse dictionary learning methods [75].…”
Section: Introduction (mentioning)
confidence: 99%
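To show why pairwise kernel evaluations dominate the cost of KDE (the step that Nystrom-type approximations and the Fast Gauss Transform are designed to reduce), here is a naive Gaussian KDE sketch; the data and bandwidth are assumed for illustration.

```python
# Sketch: naive Gaussian kernel density estimation. Its cost is dominated by the
# m-by-n pairwise kernel evaluations, which Nystrom-type approximations and the
# Fast Gauss Transform aim to reduce. Data and bandwidth are assumptions.
import numpy as np

def kde_gaussian(x_query, x_data, bandwidth=0.3):
    """Evaluate a 1-D Gaussian KDE at the query points (O(m * n) kernel evaluations)."""
    diffs = (x_query[:, None] - x_data[None, :]) / bandwidth      # m x n pairwise terms
    kernels = np.exp(-0.5 * diffs ** 2) / np.sqrt(2 * np.pi)
    return kernels.mean(axis=1) / bandwidth

rng = np.random.default_rng(3)
samples = np.concatenate([rng.normal(-2, 0.5, 500), rng.normal(1, 1.0, 500)])
grid = np.linspace(-4, 4, 9)
print(kde_gaussian(grid, samples))
```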