2022
DOI: 10.48550/arxiv.2201.08712
Preprint

Improved Random Features for Dot Product Kernels

Abstract: Dot product kernels, such as polynomial and exponential (softmax) kernels, are among the most widely used kernels in machine learning, as they enable modeling the interactions between input features, which is crucial in applications like computer vision, natural language processing, and recommender systems. We make several novel contributions for improving the efficiency of random feature approximations for dot product kernels, to make these kernels more useful in large-scale learning. First, we present a gene…
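To illustrate the kind of random feature approximation the abstract refers to, here is a minimal sketch of the classic real polynomial sketch of Kar & Karnick (2012), which the citing papers below also discuss. This is an illustrative example under assumed notation (the names `sketch`, `W`, and the Rademacher projections are mine), not the paper's improved construction:

```python
import numpy as np

rng = np.random.default_rng(0)
d, D, p = 16, 20000, 3  # input dimension, number of random features, polynomial degree

# One Rademacher (+/-1) projection vector per feature and per degree factor.
W = rng.choice([-1.0, 1.0], size=(D, p, d))

def sketch(x):
    # phi(x)_j = (1 / sqrt(D)) * prod_{k=1}^{p} <w_{j,k}, x>
    return np.prod(W @ x, axis=1) / np.sqrt(D)

x = rng.standard_normal(d); x /= np.linalg.norm(x)
y = rng.standard_normal(d); y /= np.linalg.norm(y)

approx = sketch(x) @ sketch(y)  # unbiased estimate of the polynomial kernel
exact = (x @ y) ** p            # exact kernel value (x . y)^p
```

The inner product of the two feature maps is an unbiased estimate of the degree-p polynomial kernel; with D = 20000 features it typically agrees with the exact value to about two decimal places on unit-norm inputs.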

Cited by 1 publication (2 citation statements)
References 17 publications
“…Complex random feature maps have been studied for the linear kernel by Choromanski et al (2017) and have been generalized to polynomial sketches in Wacker et al (2022).…”
Section: Related Work
confidence: 99%
“…We study randomized approximations of the polynomial kernel that have originally been proposed by Kar & Karnick (2012); Hamid et al (2014). They have been reviewed and generalized to complex projections by Wacker et al (2022). We revisit these sketches here and present the general complex case that subsumes real polynomial sketches.…”
Section: Real and Complex-valued Polynomial Sketches
confidence: 99%
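The complex generalization this citation statement describes can be sketched by drawing the projection entries from the fourth roots of unity rather than from {-1, +1}. This is again an assumed, illustrative form (names and parameters are mine), not the exact construction of Wacker et al. (2022); the real part of the Hermitian inner product of the feature maps remains an unbiased kernel estimate:

```python
import numpy as np

rng = np.random.default_rng(1)
d, D, p = 16, 20000, 3  # input dimension, number of random features, polynomial degree

# Entries drawn uniformly from the fourth roots of unity {1, -1, i, -i}.
W = rng.choice(np.array([1, -1, 1j, -1j]), size=(D, p, d))

def sketch(x):
    # Same product-of-projections form as the real sketch, now complex-valued.
    return np.prod(W @ x, axis=1) / np.sqrt(D)

x = rng.standard_normal(d); x /= np.linalg.norm(x)
y = rng.standard_normal(d); y /= np.linalg.norm(y)

# Hermitian inner product of the complex feature maps; its real part is an
# unbiased estimate of (x . y)^p, just as in the real case.
approx = np.vdot(sketch(x), sketch(y)).real
exact = (x @ y) ** p
```

Because the real sketch is the special case where the entries happen to lie in {-1, +1}, this complex form subsumes it, as the citation statement notes.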