Proceedings of the Thirty-First Annual ACM-SIAM Symposium on Discrete Algorithms (SODA 2020)
DOI: 10.1137/1.9781611975994.9
Oblivious Sketching of High-Degree Polynomial Kernels

Cited by 25 publications (85 citation statements) · References: 0 publications
“…An alternative approach would be to incorporate matrix sketching techniques to allow for more efficient computation of the SVD in the tensorized domain. For example, recent work that focuses on sketches for matrices with Kronecker product structure, such as TensorSketch [30] and its modifications [1] or the Kronecker fast Johnson-Lindenstrauss transform [21], could potentially be adapted to our setting.…”
Section: Discussion (mentioning)
confidence: 99%
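The TensorSketch construction referenced above admits a compact implementation: CountSketch each factor of a Kronecker product independently, then combine the sketches by circular convolution, computed as a pointwise product in the Fourier domain. The following NumPy code is a minimal illustration of that classical construction, not code from any of the cited papers; all function names, dimensions, and parameters are our own choices.

```python
import numpy as np

def make_tensor_sketch(dims, m, rng):
    """Build a TensorSketch map sending x1 (x) ... (x) xd (xi in R^dims[i])
    to R^m: CountSketch each factor, then combine the sketches by circular
    convolution, done as a pointwise product in the Fourier domain."""
    hashes = [rng.integers(0, m, size=n) for n in dims]      # hash bucket per coordinate
    signs = [rng.choice([-1.0, 1.0], size=n) for n in dims]  # Rademacher sign per coordinate

    def sketch(factors):
        prod = np.ones(m, dtype=complex)
        for x, h, s in zip(factors, hashes, signs):
            cs = np.zeros(m)
            np.add.at(cs, h, s * x)   # CountSketch of this factor
            prod *= np.fft.fft(cs)    # convolve sketches via FFT
        return np.real(np.fft.ifft(prod))

    return sketch

rng = np.random.default_rng(0)
n, d, m = 50, 3, 2048
xs = [rng.standard_normal(n) for _ in range(d)]
ys = [rng.standard_normal(n) for _ in range(d)]
sk = make_tensor_sketch([n] * d, m, rng)
exact = np.prod([x @ y for x, y in zip(xs, ys)])  # <x1(x)...(x)xd, y1(x)...(x)yd>
print(exact, sk(xs) @ sk(ys))                     # approximately equal
```

The check exploits the fact that the inner product of two Kronecker-structured vectors factors into a product of per-mode inner products, which the sketch preserves in expectation.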
“…In the course of adapting fast Johnson-Lindenstrauss embeddings to data with Kronecker structure as introduced in [6] (see also [3,11]), one encounters expressions of the form $(X^{(1)} \otimes \cdots \otimes X^{(d)})^T A (X^{(1)} \otimes \cdots \otimes X^{(d)})$, which are somewhat intermediate between (1) and (2), as they can be expanded as $\sum_{i_1,\ldots,i_{2d}=1}^{n} \cdots$…”
Section: Background and Studied Objects (mentioning)
confidence: 99%
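To make the quoted expression concrete, if the factors $X^{(i)}$ are vectors (a simplifying assumption for illustration), the quadratic form can be evaluated by viewing $A$ as a $2d$-mode tensor and contracting one mode per factor, so the $n^d$-dimensional Kronecker vector is never materialized. A minimal NumPy sketch, with hypothetical helper names and small sizes chosen only for the correctness check:

```python
import numpy as np

def kron_quadratic_form(A, xs):
    """Evaluate (x1 (x) ... (x) xd)^T A (x1 (x) ... (x) xd) by viewing A as a
    2d-mode tensor and contracting one mode per factor, without forming the
    n^d-dimensional Kronecker product vector."""
    n = xs[0].shape[0]
    d = len(xs)
    T = A.reshape((n,) * (2 * d))
    for x in xs + xs:                 # contract modes i_1..i_d, then j_1..j_d
        T = np.tensordot(x, T, axes=(0, 0))
    return float(T)

rng = np.random.default_rng(1)
n, d = 4, 3
xs = [rng.standard_normal(n) for _ in range(d)]
A = rng.standard_normal((n**d, n**d))
v = xs[0]
for x in xs[1:]:
    v = np.kron(v, x)                 # explicit Kronecker vector, for the check only
print(kron_quadratic_form(A, xs), v @ A @ v)  # should agree up to rounding
```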
“…Possible applications of such results include recent developments in norm-preserving maps for vectors with tensor structure in the context of machine learning methods using the kernel trick [6,3,11].…”
Section: Overview of Our Contribution (mentioning)
confidence: 99%
“…Modewise multiplication and $j$-mode products: For $1 \le j \le d$, the $j$-mode product of a $d$-mode tensor $X \in \mathbb{R}^{n_1 \times \cdots \times n_{j-1} \times n_j \times n_{j+1} \times \cdots \times n_d}$ with a matrix $U \in \mathbb{R}^{m_j \times n_j}$ is another $d$-mode tensor $X \times_j U \in \mathbb{R}^{n_1 \times \cdots \times n_{j-1} \times m_j \times n_{j+1} \times \cdots \times n_d}$. Its entries are given by (2) $(X \times_j U)_{i_1,\ldots,i_{j-1},\ell,i_{j+1},\ldots,i_d} = \cdots$…”
Section: Tensor Prerequisites (mentioning)
confidence: 99%
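In NumPy, the $j$-mode product just defined reduces to a single tensordot followed by a moveaxis that returns the new mode to position $j$. A minimal sketch; the function name and test sizes are our own, not from the cited work:

```python
import numpy as np

def mode_j_product(X, U, j):
    """j-mode product X x_j U: contract mode j of X (size n_j) against the
    columns of U in R^{m_j x n_j}; mode j of the result has size m_j."""
    # tensordot sums U's axis 1 against X's axis j and puts the new axis
    # first; moveaxis restores it to position j.
    return np.moveaxis(np.tensordot(U, X, axes=(1, j)), 0, j)

rng = np.random.default_rng(2)
X = rng.standard_normal((3, 4, 5))   # a 3-mode tensor
U = rng.standard_normal((7, 4))      # maps mode 1 from size 4 to size 7
Y = mode_j_product(X, U, 1)
print(Y.shape)                       # (3, 7, 5)
# entrywise: Y[i, l, k] = sum_j U[l, j] * X[i, j, k]
assert np.allclose(Y[0, :, 0], U @ X[0, :, 0])
```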
“…Other recent work involving the analysis of modewise maps for tensor data includes, e.g., applications in kernel learning methods which effectively use modewise operators specialized to finite sets of rank-one tensors [2], as well as a variety of works in the computer science literature aimed at compressing finite sets of low-rank tensors (with respect to, e.g., CP and tensor train decompositions [32]). More general results involving extensions of bounded orthonormal sampling results to the tensor setting [23,3] apply to finite sets of arbitrary tensors.…”
Section: Introduction and Prior Work (mentioning)
confidence: 99%
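As a small illustration of a modewise map specialized to rank-one tensors: the image of $x_1 \circ \cdots \circ x_d$ under independent per-mode maps $A_1, \ldots, A_d$ is just the rank-one tensor of the sketched factors $A_1 x_1, \ldots, A_d x_d$, and its Frobenius norm is the product of the factor norms. The NumPy sketch below uses Gaussian JL maps and sizes of our own choosing, as a generic stand-in for the constructions in the cited works:

```python
import numpy as np

rng = np.random.default_rng(3)
n, m, d = 100, 20, 3

# Independent Gaussian JL maps, one per mode.
maps = [rng.standard_normal((m, n)) / np.sqrt(m) for _ in range(d)]

# For a rank-one tensor x1 o ... o xd, the modewise sketch is the rank-one
# tensor of the sketched factors, so the full tensor is never formed.
xs = [rng.standard_normal(n) for _ in range(d)]
sketched = [A @ x for A, x in zip(maps, xs)]

# Frobenius norm of a rank-one tensor = product of factor norms; each JL map
# preserves its factor's norm up to (1 +/- eps), so the product is preserved
# up to roughly (1 +/- eps)^d.
orig_norm = np.prod([np.linalg.norm(x) for x in xs])
sk_norm = np.prod([np.linalg.norm(y) for y in sketched])
print(orig_norm, sk_norm)            # approximately equal
```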