2021
DOI: 10.48550/arxiv.2101.09581
Preprint

Machine learning of high dimensional data on a noisy quantum processor

Abstract: We present a quantum kernel method for high-dimensional data analysis using Google's universal quantum processor, Sycamore. This method is successfully applied to the cosmological benchmark of supernova classification using real spectral features with no dimensionality reduction and without vanishing kernel elements. Instead of using a synthetic dataset of low dimension or pre-processing the data with a classical machine learning algorithm to reduce the data dimension, this experiment demonstrates that machine…
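The quantum kernel method the abstract describes can be illustrated with a minimal sketch. The circuit below is an assumption for illustration only, not the paper's actual Sycamore circuit: it uses a simple product angle encoding, for which the state overlap factorises and the fidelity kernel K(x, y) = |⟨ψ(x)|ψ(y)⟩|² can be computed qubit by qubit, even for high-dimensional inputs.

```python
import math

def fidelity_kernel(x, y):
    """Kernel entry K(x, y) = |<psi(x)|psi(y)>|^2 for a product
    angle encoding: qubit i holds (cos(x_i/2), sin(x_i/2)).
    For product states the overlap factorises into per-qubit
    cosines, so no 2^n-dimensional statevectors are needed."""
    overlap = 1.0
    for xi, yi in zip(x, y):
        overlap *= math.cos((xi - yi) / 2.0)
    return overlap ** 2

# Gram matrix over a toy dataset (each row stands in for one
# hypothetical spectral-feature vector)
data = [[0.1, 1.2, 0.7], [0.4, 0.9, 1.5], [2.0, 0.3, 0.2]]
gram = [[fidelity_kernel(a, b) for b in data] for a in data]
```

The resulting Gram matrix can be handed to any classical kernel classifier (e.g. an SVM with a precomputed kernel); on hardware, each entry would instead be estimated from repeated measurements.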

Cited by 15 publications (20 citation statements) · References 21 publications
“…The most prominent approach to construct learning models using NISQ devices relies on the use of parametrized quantum circuits (PQCs) [11][12][13][14]. Kernel methods in particular have emerged as one particular candidate to realize QML models [15][16][17][18][19][20][21]. Furthermore, it was recently shown that other types of variational quantum learning models are…”
Section: Introduction
confidence: 99%
“…We also analyze the influence of finite sampling on kernel quality - an issue touched upon in Ref. [21] - showing that the number of samples typically required to achieve an approximation of fixed precision is of third order in the number of datapoints. We review preexisting strategies that can be used to alleviate the influence of noise on the kernel matrix and propose a different one based on a semi-definite program.…”
Section: Introduction
confidence: 99%
“…Pioneering experimental explorations have validated the crucial role of the ansatz when applying VQAs to accomplish tasks in different fields such as machine learning [30][31][32][33],…”
confidence: 99%
“…One possible route to quantum advantage in machine learning is the use of quantum embedding kernels [3][4][5], where quantum computers are used to encode data in ways that are difficult for classical machine learning methods [6][7][8]. Noisy intermediate scale quantum computers [9,10] are capable of solving tasks difficult for classical computers [11,12] and have shown promise in running proof-of-principle quantum machine learning applications [13][14][15][16][17][18][19][20][21][22][23]. However, major bottlenecks still limit the use of quantum hardware for machine learning with practical applications.…”
confidence: 99%