2024
DOI: 10.1088/2058-9565/ad152e
Building spatial symmetries into parameterized quantum circuits for faster training

Frédéric Sauvage,
Martín Larocca,
Patrick J Coles
et al.

Abstract: Practical success of quantum learning models hinges on having a suitable structure for the parameterized quantum circuit. Such structure is defined both by the types of gates employed and by the correlations of their parameters. While much research has been devoted to devising adequate gate-sets, typically respecting some symmetries of the problem, very little is known about how their parameters should be structured. In this work, we show that an ideal parameter structure naturally emerges when careful…
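The abstract's central idea, correlating (tying) circuit parameters across gates related by a symmetry of the problem, can be illustrated with a minimal sketch. This is a hypothetical NumPy illustration, not the paper's actual construction: under full permutation symmetry of the qubits, all single-qubit rotations in a layer share one angle instead of each carrying its own.

```python
import numpy as np

def rx(theta):
    """Single-qubit RX(theta) rotation matrix."""
    c, s = np.cos(theta / 2), -1j * np.sin(theta / 2)
    return np.array([[c, s], [s, c]])

def permutation_symmetric_layer(theta, n_qubits):
    """One ansatz layer with tied parameters: the SAME angle theta is
    applied to every qubit, so the layer commutes with qubit permutations.
    Returns the full 2^n x 2^n layer unitary."""
    U = np.array([[1.0 + 0j]])
    for _ in range(n_qubits):
        U = np.kron(U, rx(theta))
    return U

# One trainable parameter replaces three independent ones on 3 qubits.
U = permutation_symmetric_layer(0.3, 3)
```

An unstructured layer on the same qubits would instead take a vector of three independent angles; the tied version shrinks the parameter space while preserving the problem's symmetry, which is the kind of structure the abstract argues accelerates training.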


Cited by 16 publications (4 citation statements)
References 61 publications
“…Namely, highly expressive encodings (whether using fidelity or projected kernels) should be avoided. Or, more concretely, unstructured data-embeddings [11, 15, 46, 68] should generally be avoided and the data structure should be taken into account when designing a data-embedding (for instance by constructing geometrically inspired embedding schemes [38–41]).…”
Section: Results
confidence: 99%
“…Our work provides a systematic study of the barriers to the successful scaling up of quantum kernel methods posed by exponential concentration. Prior work on BPs motivated the community to search for ways to avoid or mitigate BPs, such as employing correlated parameters [85], using tools from quantum optimal control [22, 86], or developing the field of geometrical quantum machine learning [38–41]. In a similar manner, we stress our results should not be understood as condemning quantum kernel methods, but rather as a prompt to develop exponential-concentration-free embeddings for quantum kernels.…”
Section: Discussion
confidence: 99%
“…The scalability of our protocol is closely related to the expressivity and trainability of PQCs, which have been extensively discussed in current research on variational quantum algorithms and quantum neural networks [30, 31, 41–45]. Despite the difficulties of scaling general PQCs, some prior knowledge about U, such as the sparsity, locality, and symmetry of the generator Hamiltonian, can usually be accessed and used to enhance the PQCs' performance near U while preserving a low circuit depth [8, 46]. It is worth mentioning that a recent work [47] introduces a data quantum Fisher information metric to measure model performance, which may be adopted here as an indicator for adaptively designing the PQCs' architecture even in the more general case.…”
Section: Conclusion and Discussion
confidence: 99%
“…Numerous endeavors have been undertaken to create learning models that are tailored specifically to a given task. Among these, geometric QML (GQML) has emerged as one of the most promising approaches [8–15]. The fundamental idea behind GQML is to leverage the symmetries present in the task to develop sharp inductive biases for the learning models.…”
Section: Introduction
confidence: 99%