2021
DOI: 10.1038/s43588-021-00084-1

The power of quantum neural networks

Cited by 535 publications (395 citation statements)
References 35 publications

“…Given the fundamental role of generalization bounds, there has recently been a strong and steady stream of works contributing to the derivation of generalization bounds for PQC-based models [24][25][26][27][28][29][30][31][32]. However, as discussed in detail in Section 4, these prior works all differ from our results in a variety of ways.…”
Section: Introduction (mentioning)
confidence: 83%
“…With this in mind, we begin our survey of implicitly encoding-dependent generalization bounds with Ref. [25], which has suggested a complexity measure based on the classical Fisher information, called the effective dimension, and demonstrated that one can indeed state generalization bounds in terms of the effective dimension. Utilizing the empirical Fisher information as a tool for approximating the effective dimension, Ref.…”
Section: Encoding-dependent Complexity and Generalization Bounds (mentioning)
confidence: 99%
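For context, the effective dimension referred to in this statement is built from the trace-normalised Fisher information of the model. Schematically, and as best I recall the definition in the cited work (conventions such as the admissible range of γ should be checked against the paper itself):

    d_{\gamma,n}(\mathcal{M}_\Theta)
      = \frac{2\,\log\!\Big(\frac{1}{V_\Theta}\int_\Theta
          \sqrt{\det\big(\mathrm{id}_d + \tfrac{\gamma n}{2\pi\log n}\,\hat F(\theta)\big)}\,
          \mathrm{d}\theta\Big)}{\log\tfrac{\gamma n}{2\pi\log n}},
    \qquad
    \hat F(\theta) = \frac{d\,V_\Theta}{\int_\Theta \operatorname{tr}F(\theta)\,\mathrm{d}\theta}\,F(\theta),

where d is the number of trainable parameters, Θ the parameter space with volume V_Θ, F(θ) the classical Fisher information matrix, n the number of data samples and γ ≤ 1 a constant. In practice F(θ) is replaced by the empirical Fisher information estimated from samples, which is what this statement refers to; a Monte Carlo estimator along these lines is sketched after the later citation on capacity measures.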
“…Gradient scaling is one of the few directions of significant progress. The most famous gradient scaling result is the barren plateau phenomenon [19][20][21][22][23][24][25][26][27][28][29][30][31][32][33], whereby the gradient of the cost function shrinks exponentially with the number of qubits. Various issues lead to barren plateaus, such as deep ansatzes that lack structure [19,21,31], global cost functions [20,21], high levels of noise [22,32], scrambling target unitaries [24], and large entanglement [28,29].…”
Section: Introduction (mentioning)
confidence: 99%
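To make the quoted scaling concrete, here is a small self-contained numerical sketch (not from the cited works) of the barren-plateau effect: for a deep, unstructured ansatz of per-qubit RY rotations and CZ entanglers, the variance of a single parameter-shift gradient of ⟨Z_0⟩, taken over random parameter settings, should shrink rapidly as the qubit count n grows. The ansatz, depth, observable and sample counts are illustrative choices, not those of any particular reference.

    import numpy as np

    I2, Z = np.eye(2), np.diag([1.0, -1.0])
    CZ = np.diag([1.0, 1.0, 1.0, -1.0])

    def kron_all(ops):
        # Tensor product of a list of one- and two-qubit operators.
        out = np.array([[1.0]])
        for op in ops:
            out = np.kron(out, op)
        return out

    def ry(theta):
        c, s = np.cos(theta / 2), np.sin(theta / 2)
        return np.array([[c, -s], [s, c]])

    def cost(params, n, entangler, obs):
        # <Z_0> after layers of per-qubit RY rotations, each followed by a CZ chain.
        psi = np.zeros(2 ** n)
        psi[0] = 1.0
        for thetas in params:                      # params has shape (layers, n)
            psi = entangler @ (kron_all([ry(t) for t in thetas]) @ psi)
        return float(psi @ obs @ psi)

    def grad_variance(n, layers=None, samples=100, seed=0):
        # Variance of dC/dtheta for the first angle, via the parameter-shift rule,
        # over randomly initialised deep circuits.
        layers = layers or 5 * n
        entangler = np.eye(2 ** n)
        for q in range(n - 1):                     # CZ on neighbouring qubit pairs
            entangler = kron_all([I2] * q + [CZ] + [I2] * (n - q - 2)) @ entangler
        obs = kron_all([Z] + [I2] * (n - 1))       # Z on qubit 0
        rng = np.random.default_rng(seed)
        grads = []
        for _ in range(samples):
            params = rng.uniform(0, 2 * np.pi, (layers, n))
            shift = np.zeros_like(params)
            shift[0, 0] = np.pi / 2
            grads.append(0.5 * (cost(params + shift, n, entangler, obs)
                                - cost(params - shift, n, entangler, obs)))
        return np.var(grads)

    for n in range(2, 9, 2):
        print(f"n = {n}: Var[dC/dtheta] ~ {grad_variance(n):.2e}")

The dense-matrix simulation is deliberately naive and only meant to expose the trend in the printed variances; a real study would use a proper simulator and many more samples.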
“…The authors of Ref. [73] have looked at the classical Fisher information of a parametrized quantum circuit to quantify its capacity [73]. They define a new capacity measure they call the effective dimension which can be used to bound how well a variational quantum learning model can generalize on unseen data.…”
Section: Analyzing Quantum Learning Models (mentioning)
confidence: 99%
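Continuing from the definition quoted earlier, the effective dimension can be estimated by Monte Carlo once Fisher information matrices have been computed at a set of randomly drawn parameter vectors. The sketch below assumes those matrices are already available (here faked by an empirical Fisher built from random stand-in score vectors); it is an illustration of the formula, not the authors' implementation, and the γ convention follows my reading of the paper.

    import numpy as np

    def effective_dimension(fishers, n_data, gamma=1.0):
        # Monte Carlo estimate of the effective dimension from Fisher matrices
        # F(theta_k) sampled at k random parameter settings; shape (k, d, d).
        k, d, _ = fishers.shape
        # Trace-normalise so the average trace equals d (the "hat F" above).
        f_hat = d * fishers / np.mean([np.trace(f) for f in fishers])
        kappa = gamma * n_data / (2 * np.pi * np.log(n_data))
        # log sqrt(det(id_d + kappa * hat F)), computed stably via slogdet ...
        logdets = np.array([np.linalg.slogdet(np.eye(d) + kappa * f)[1] for f in f_hat])
        # ... and averaged in log space for numerical stability.
        log_avg = np.logaddexp.reduce(0.5 * logdets) - np.log(k)
        return 2 * log_avg / np.log(kappa)

    # Toy usage: empirical Fisher F ~ E[g g^T] from per-sample score vectors g
    # (random stand-ins here; in practice g = grad_theta log p(x, y; theta)).
    rng = np.random.default_rng(1)
    d, k, m = 8, 50, 200                 # parameters, theta samples, data samples
    fishers = np.empty((k, d, d))
    for i in range(k):
        g = rng.normal(size=(m, d))
        fishers[i] = g.T @ g / m
    print(effective_dimension(fishers, n_data=5000))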
“…It does not take into account the number of available datapoints and should therefore not be confused with the quantum generalization of the effective dimension of Ref. [73]. It is still an intuitive measure as the rank directly captures in how many directions a varying of the parameters will also result in a varying of the underlying quantum state.…”
Section: Analyzing Quantum Learning Models (mentioning)
confidence: 99%
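The rank intuition in this last statement can be checked numerically: count the independent directions in which the output state moves when the parameters are varied, which is the rank of the quantum Fisher information metric once the global-phase direction is projected out. The finite-difference sketch below is my own illustration under those assumptions, not code from the cited references; state_fn is a hypothetical callable mapping a parameter vector to a normalised statevector.

    import numpy as np

    def state_rank(state_fn, theta, eps=1e-6, tol=1e-6):
        # Numerical rank of the tangent space of |psi(theta)>: how many independent
        # directions of parameter variation actually change the quantum state.
        psi = state_fn(theta)
        jac = np.empty((len(psi), len(theta)), dtype=complex)
        for i in range(len(theta)):
            dtheta = np.zeros_like(theta)
            dtheta[i] = eps
            dpsi = (state_fn(theta + dtheta) - state_fn(theta - dtheta)) / (2 * eps)
            # Project out the global-phase (parallel-to-psi) component.
            jac[:, i] = dpsi - psi * (np.conj(psi) @ dpsi)
        # Rank over the reals: stack real and imaginary parts.
        return np.linalg.matrix_rank(np.vstack([jac.real, jac.imag]), tol=tol)

    # Toy usage: two RY angles acting on the same qubit are redundant
    # (RY(b) RY(a) |0> = RY(a + b) |0>), so the rank is 1, not 2.
    def circuit(theta):
        a, b = theta
        return np.array([np.cos((a + b) / 2), np.sin((a + b) / 2)])

    print(state_rank(circuit, np.array([0.3, 0.7])))   # -> 1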