Proceedings of the 52nd Annual ACM SIGACT Symposium on Theory of Computing 2020
DOI: 10.1145/3357713.3384314

Sampling-based sublinear low-rank matrix arithmetic framework for dequantizing quantum machine learning

Abstract: We present an algorithmic framework for quantum-inspired classical algorithms on close-to-low-rank matrices, generalizing the series of results started by Tang's breakthrough quantum-inspired algorithm for recommendation systems [STOC'19]. Motivated by quantum linear algebra algorithms and the quantum singular value transformation (SVT) framework of Gilyén et al. [STOC'19], we develop classical algorithms for SVT that run in time independent of input dimension, under suitable quantum-inspired sampling assumptions. […]
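The dimension-independence claimed in the abstract rests on a specific access model: ℓ² ("length-squared") sampling of the input matrix. The sketch below is a minimal illustration of that primitive, not the paper's implementation; the function name and the direct NumPy realization are assumptions for exposition (the paper assumes this access is provided in time polylogarithmic in the dimensions, e.g. via a precomputed tree of squared norms).

```python
import numpy as np

def length_squared_row_sample(A, rng=None):
    """Sample a row index i with probability ||A[i]||^2 / ||A||_F^2.

    Illustrative sketch of the ell^2 sampling access that
    quantum-inspired algorithms assume; here computed naively,
    whereas the framework assumes it is available in time
    independent of the matrix dimensions.
    """
    rng = np.random.default_rng() if rng is None else rng
    row_norms_sq = np.einsum("ij,ij->i", A, A)  # ||A[i]||^2 per row
    probs = row_norms_sq / row_norms_sq.sum()   # normalize by ||A||_F^2
    return rng.choice(A.shape[0], p=probs)
```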

Cited by 73 publications (112 citation statements) · References 38 publications

“…As far as we know, the only work along this direction is by [75], which gives a quantum algorithm for finding the negative curvature of a point in time Õ(poly(r, 1/ε)), where r is the rank of the Hessian at that point. However, the algorithm has a few drawbacks: 1) the cost is expensive when r = Θ(n); 2) it relies on a quantum data structure [52] which can actually be dequantized to classical algorithms with comparable cost [20,66,67]; 3) it can only find the negative curvature for a fixed Hessian. In all, it is unclear whether this quantum algorithm achieves a speedup for escaping from saddle points.…”
Section: Related Work · Citation type: mentioning
confidence: 99%
“…The fundamental difference between quantum-inspired algorithms and traditional classical algorithms is that, via importance sampling, their runtime is independent of the dimension of the input data, which makes the setting comparable with quantum machine learning algorithms aided by QRAM. Recently, Chia et al. (2020) introduced an algorithmic framework for quantum-inspired classical algorithms on low-rank matrices that generalizes a series of previous works and recovers existing quantum-inspired algorithms such as quantum-inspired recommendation systems (Tang 2019), quantum-inspired principal component analysis (Tang 2018), quantum-inspired low-rank matrix inversion (Gilyén et al. 2018), and quantum-inspired support vector machines (Ding et al. 2019).…”
Section: Complexity of Quantum Semi-supervised LS-SVM · Citation type: mentioning
confidence: 86%
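To see why importance sampling removes the dimension dependence described in the statement above, consider the classical matrix-product estimator in the Frieze–Kannan–Vempala style that these dequantized algorithms build on. The sketch below is illustrative (names and API are mine, not from the cited works): the number of samples s controls the approximation error independently of the number of rows n.

```python
import numpy as np

def sampled_matrix_product(A, B, s, rng=None):
    """Unbiased estimate of A.T @ B from s length-squared-sampled rows.

    Each sampled term A[i].T @ B[i] is rescaled by 1/(s * p_i), so the
    expectation equals A.T @ B exactly; the variance decays with s,
    independent of the number of rows n.
    """
    rng = np.random.default_rng() if rng is None else rng
    row_norms_sq = np.einsum("ij,ij->i", A, A)
    probs = row_norms_sq / row_norms_sq.sum()   # length-squared distribution
    idx = rng.choice(A.shape[0], size=s, p=probs)
    scale = 1.0 / (s * probs[idx])              # importance weights
    return (A[idx] * scale[:, None]).T @ B[idx]
```

Only the s sampled rows are ever touched, which is the sense in which such routines run in time independent of the input dimension once sampling access is given.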
“…Among the most relevant results obtained in Quantum Machine Learning, it is worth mentioning the use of trainable parametrized digital and continuous-variable quantum circuits as a model for quantum neural networks [12]–[21], the realization of quantum Support Vector Machines (qSVMs) [22] working in quantum-enhanced feature spaces [23], [24], and the introduction of quantum versions of artificial neuron models [25]–[32]. However, very few clear statements have been made concerning the concrete and quantitative achievement of quantum advantage in machine learning applications, and many challenges still need to be addressed [8], [33], [34].…”
Section: Introduction · Citation type: mentioning
confidence: 99%