2021
DOI: 10.48550/arxiv.2105.07228
Preprint

Universality and Optimality of Structured Deep Kernel Networks

Abstract: Kernel based methods yield approximation models that are flexible, efficient and powerful. In particular, they utilize fixed feature maps of the data and are often associated with strong analytical results that prove their accuracy. On the other hand, the recent success of machine learning methods has been driven by deep neural networks (NNs). They achieve significant accuracy on very high-dimensional data, in that they are also able to learn efficient data representations or data-based feature maps. In this pap…

Cited by 1 publication (2 citation statements)
References 23 publications

“…However, since they are kernel models that can be optimized instead of relying on fixed activation functions, they allow for more flexibility and thus enable a potentially faster optimization. As analyzed in [10], these SDKNs enjoy universal approximation properties in various limit cases. The 2L-VKOGA [9] combines the use of an optimized two-layered kernel with greedy kernel algorithms, where we leverage the VKOGA algorithm [7]: Radial basis function kernels are a popular class of kernels which are given as k(x, z) = Φ(x − z) for some radial basis function Φ : ℝ^d → ℝ, e.g.…”
Section: Deep Kernel Models
confidence: 98%
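
The citation above describes radial basis function kernels and the greedy selection used by VKOGA. The following is a minimal sketch of these two ingredients, not the authors' VKOGA implementation: the Gaussian kernel choice, the f-greedy residual criterion, and the small regularization constant are illustrative assumptions.

```python
import numpy as np

def gaussian_rbf_kernel(X, Z, shape_param=1.0):
    """Gaussian RBF kernel k(x, z) = exp(-eps^2 * ||x - z||^2).

    X: (n, d) array, Z: (m, d) array -> (n, m) kernel matrix.
    """
    sq_dists = (np.sum(X**2, axis=1)[:, None]
                + np.sum(Z**2, axis=1)[None, :] - 2 * X @ Z.T)
    return np.exp(-shape_param**2 * np.maximum(sq_dists, 0.0))

def f_greedy_interpolation(X, y, n_centers=10, shape_param=1.0):
    """Toy f-greedy selection: repeatedly pick the point with the largest
    residual, then interpolate on the selected centers.

    Returns the selected indices and the interpolation coefficients.
    """
    selected = []
    residual = y.copy()
    for _ in range(n_centers):
        idx = int(np.argmax(np.abs(residual)))  # largest residual -> next center
        selected.append(idx)
        K = gaussian_rbf_kernel(X[selected], X[selected], shape_param)
        # tiny ridge term only for numerical stability of this toy example
        coeffs = np.linalg.solve(K + 1e-10 * np.eye(len(selected)), y[selected])
        residual = y - gaussian_rbf_kernel(X, X[selected], shape_param) @ coeffs
    return selected, coeffs

# Usage sketch: approximate sin on scattered 1D data with 15 greedily chosen centers.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 2.0 * np.pi, size=(200, 1))
y = np.sin(X[:, 0])
centers, coeffs = f_greedy_interpolation(X, y, n_centers=15, shape_param=1.0)
```
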
“…In order to make the best out of both approaches, the idea is to combine deep learning methods such as neural networks with the benefits of "shallow" kernel methods. In the following Section 4, we review two recently introduced algorithms in this direction: SDKNs [10] as well as 2L-VKOGA [9].…”
Section: Machine Learning For Regression
confidence: 99%
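
The 2L-VKOGA cited above optimizes a two-layered kernel before running the greedy algorithm. As a rough illustration only, assuming the first layer is a trainable matrix A composed with a Gaussian base kernel (the exact parameterization in [9] is not reproduced here), such a kernel could look like this:

```python
import numpy as np

def two_layered_kernel(X, Z, A, shape_param=1.0):
    """Two-layered kernel sketch: k_A(x, z) = k_base(A x, A z), where A is a
    trainable linear first layer and k_base is a Gaussian RBF kernel.

    X: (n, d), Z: (m, d), A: (d', d) -> (n, m) kernel matrix.
    """
    XA, ZA = X @ A.T, Z @ A.T  # apply the linear layer to both arguments
    sq_dists = (np.sum(XA**2, axis=1)[:, None]
                + np.sum(ZA**2, axis=1)[None, :] - 2 * XA @ ZA.T)
    return np.exp(-shape_param**2 * np.maximum(sq_dists, 0.0))

# Example: a diagonal A acts like learned per-feature length scales.
A = np.diag([1.0, 0.1, 2.0])
X = np.random.default_rng(1).normal(size=(5, 3))
K = two_layered_kernel(X, X, A)  # (5, 5) symmetric kernel matrix
```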