2021
DOI: 10.48550/arxiv.2105.14434
Preprint

Provable superior accuracy in machine learned quantum models

Abstract: In modelling complex processes, the amount of potential past data that influences future expectations is immense. Models that track all of this data are not only computationally wasteful but also shed little light on which past data most influence the future. There is thus enormous interest in dimensional reduction: finding automated means to reduce the memory dimension of our models while minimizing the impact on their predictive accuracy. Here we construct dimensionally reduced quantum models by machine learning methods tha…

Cited by 6 publications (8 citation statements) | References 51 publications
“…Finally, it is important to identify instances in which the proposed model class may offer improvements as compared to classical gated architectures, such as the gated LSTM model. An interesting direction could be the investigation of the memory cost of time warping-invariant quantum recurrent models, since quantum models have been shown to describe temporal sequences with reduced memory [25,24]. Given the performance of the proposed model on prediction tasks, we expect it to be advantageous for tasks related to quantum phenomena, with extensions possibly operating directly on quantum data [39].…”
Section: Discussion and Future Work
confidence: 99%
“…In this section, we present an alternative derivation of the TWI-QRNN model that follows directly the steps in [7]. To start, we introduce a continuous-time version of the update (7) via the approximation

ρ(t + δt) ≈ ρ(t) + (dρ(t)/dt) δt,    (25)

where we have used ρ(t) to denote a continuous-time version of the density matrix and δt > 0 is a time interval. The update (7) can be viewed as the standard Euler approximation of (25).…”
Section: B. Alternative Derivation of TWI-QRNNs
confidence: 99%
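For intuition on the step quoted above, here is a minimal numerical sketch of the Euler update ρ(t + δt) ≈ ρ(t) + (dρ(t)/dt) δt. The generator used below (closed-system von Neumann dynamics with a 2×2 Hamiltonian H) is purely illustrative and is not the TWI-QRNN generator defined in the cited paper:

```python
import numpy as np

def euler_step(rho, generator, dt):
    """One Euler step: rho(t + dt) ~ rho(t) + generator(rho(t)) * dt."""
    return rho + generator(rho) * dt

# Illustrative generator only: closed-system von Neumann dynamics,
# d(rho)/dt = -i [H, rho], with a hypothetical 2x2 Hamiltonian H.
H = np.array([[1.0, 0.5], [0.5, -1.0]], dtype=complex)
von_neumann = lambda rho: -1j * (H @ rho - rho @ H)

rho = np.array([[1.0, 0.0], [0.0, 0.0]], dtype=complex)  # pure state |0><0|
dt = 1e-3
for _ in range(1000):
    rho = euler_step(rho, von_neumann, dt)

# The commutator generator is traceless, so tr(rho) stays 1 exactly;
# other properties (e.g. purity) drift at O(dt), as expected for Euler.
print(np.trace(rho).real)
```

Shrinking δt trades integration error against step count, which is exactly the sense in which a discrete update like (7) approximates continuous dynamics like (25).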
“…This is particularly useful for running the models on quantum hardware, rather than simulators, and it is also left for future work. An interesting direction could be the investigation of the memory cost of time warping-invariant quantum recurrent models, since quantum models have been shown to describe temporal sequences with reduced memory [24,25]. Another interesting application is the use of the proposed model for quantum optimal control problems, which have already been treated using classical LSTMs [40,41].…”
Section: Discussion and Future Work
confidence: 99%
“…The model has been shown to be capable of learning sequential data [22], and its capacity was analyzed in [23] in comparison to counterpart classical models with the same number of memory units. In particular, it was shown in [24,25] that quantum models can describe temporal sequences with reduced memory, compared to their classical counterparts. A model with classical memory was introduced in [26] by integrating quantum models within a classical LSTM architecture.…”
Section: Related Work
confidence: 99%
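For background on the reduced-memory claim in [24,25]: the quantum stochastic-modelling literature that this preprint builds on compares memory costs roughly as follows. This is a hedged sketch with illustrative notation (π, σ_s, C_μ, C_q are not taken from the quoted papers):

```latex
% Classical optimal predictors store the causal state s, occurring with
% stationary probability \pi(s); their memory cost is the statistical
% complexity C_mu. Quantum models encode s into generally non-orthogonal
% states |sigma_s>, lowering the memory cost to the von Neumann entropy C_q.
\[
  C_\mu = -\sum_s \pi(s) \log \pi(s),
  \qquad
  C_q = -\operatorname{tr}(\rho \log \rho),
  \quad
  \rho = \sum_s \pi(s)\, \lvert \sigma_s \rangle\langle \sigma_s \rvert ,
\]
\[
  C_q \le C_\mu, \quad \text{with strict inequality whenever the }
  \lvert \sigma_s \rangle \text{ are non-orthogonal.}
\]
```

This entropic gap is the sense in which quantum models are said to "describe temporal sequences with reduced memory" relative to their classical counterparts.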