2018
DOI: 10.1103/physreve.98.042114
Matrix product operators for sequence-to-sequence learning

Abstract: The method of choice for studying one-dimensional strongly interacting many-body quantum systems is based on matrix product states and operators. This method allows one to explore the most relevant, and numerically manageable, portion of an exponentially large space. It also allows one to accurately describe correlations between distant parts of a system, an important ingredient for accounting for context in machine learning tasks. Here we introduce a machine learning model in which matrix product operators are trained to…
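To make the setting concrete, the sketch below shows one way a matrix product operator (MPO) can assign a score to a candidate output sequence given an input sequence, by fixing the MPO's physical legs with local encodings of the two sequences and multiplying the resulting transfer matrices. This is a minimal NumPy illustration, not the authors' implementation: the one-hot feature map, the dimensions, and the scoring convention are assumptions made here for clarity.

```python
import numpy as np

# Minimal sketch of scoring with a matrix product operator (MPO).
# Feature map, dimensions, and scoring convention are illustrative
# assumptions, not the published model's exact choices.

L, d_in, d_out, D = 6, 2, 3, 4   # sequence length, local dims, bond dim
rng = np.random.default_rng(0)

# One rank-4 MPO tensor per site, indexed (left bond, input, output, right bond);
# the boundary bonds have dimension 1.
mpo = [rng.normal(scale=0.1,
                  size=(1 if s == 0 else D, d_in, d_out,
                        D if s < L - 1 else 1))
       for s in range(L)]

def one_hot(index, dim):
    v = np.zeros(dim)
    v[index] = 1.0
    return v

def sequence_score(mpo, x_seq, y_seq):
    """Contract the MPO with one-hot encodings of input x and candidate
    output y, multiplying the per-site transfer matrices left to right.
    Returns the scalar <y| W |x>."""
    env = np.ones(1)                         # left boundary bond (dim 1)
    for W, x, y in zip(mpo, x_seq, y_seq):
        # Fix the physical legs at this site: result has shape (left, right).
        M = np.einsum('lior,i,o->lr', W, one_hot(x, d_in), one_hot(y, d_out))
        env = env @ M                        # carry the bond index forward
    return env.item()                        # right boundary bond has dim 1

# Example: how strongly this MPO associates output y with input x.
x = [0, 1, 0, 1, 0, 1]
y = [0, 1, 2, 0, 1, 2]
print(sequence_score(mpo, x, y))
```

In an actual model the MPO tensors would be optimised, for example by gradient descent or DMRG-style sweeps, so that correct output sequences receive the highest scores.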

Cited by 59 publications (53 citation statements)
References 57 publications
“…These results thus provide a direct means of using MPS methods to study the resource requirements of quantum stochastic simulation, extending their relevance to the field of predictive modeling. Our approach complements other uses of tensor network methods for the description of classical systems with stochastic elements [35-41] and machine learning [42-50]. We extend these results in introducing causal structure, adapting MPS methods for predictive modeling. Predictive models: Consider a system that generates an output x_t sampled from a random variable X_t at each time t.…”
mentioning
confidence: 91%
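For readers unfamiliar with the terminology in this excerpt, a predictive model is usually formalised along the following lines. This is the standard definition from the stochastic-modelling literature, stated here for context rather than quoted from the citing paper.

```latex
% A predictive model for a stochastic process {X_t}: an encoding f of
% observed pasts into memory states that preserves the conditional
% distribution over futures.
\[
  s_t = f(\overleftarrow{x}_t), \qquad
  P\!\left(\overrightarrow{X}_t \,\middle|\, S_t = f(\overleftarrow{x}_t)\right)
  = P\!\left(\overrightarrow{X}_t \,\middle|\, \overleftarrow{X}_t = \overleftarrow{x}_t\right),
\]
% where \overleftarrow{x}_t = \dots x_{t-2} x_{t-1} is the observed past and
% \overrightarrow{X}_t = X_t X_{t+1} \dots is the future the model must predict.
```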
“…[41]). As shown in the previous works [15, 25, 26, 28-32], TN models (including MPS) possess remarkable generalization power that is competitive with neural networks. Notably, TN models surpass neural networks as they possess high interpretability and allow us to implement quantum processes.…”
Section: Tensor-Network Compressed Sensing
mentioning
confidence: 72%
“…Recently, TN [20-24] has rapidly developed into a powerful quantum-inspired computational tool for machine learning, which brings new possibilities and wide perspectives for processing real-life data, such as images and texts, in the quantum processes based on many-qubit (or many-body) states representing the probability distribution of the data that Alice considers to send, (2) encode the specific piece of information to be sent by projecting | , (3) decode the information as a generative process by the projected Born machine [15, 25-32]. High efficiencies have been demonstrated at least for the classical simulations of these quantum processes.…”
Section: Introduction
mentioning
confidence: 99%
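The "projected Born machine" generative step mentioned above relies on sampling configurations from the Born-rule distribution p(x) = |⟨x|ψ⟩|² encoded in a tensor-network state. The sketch below illustrates the standard sequential conditional-sampling procedure for a small random matrix product state; it is a generic illustration under assumed dimensions and tensors, not the cited papers' code.

```python
import numpy as np

# Minimal sketch: sampling from the Born distribution p(x) = |<x|psi>|^2
# of a small random matrix product state (MPS).  Dimensions and tensors
# are assumptions made for illustration only.

L, d, D = 8, 2, 4                 # sites, local dimension, bond dimension
rng = np.random.default_rng(1)

# MPS tensors A[s] with index order (left bond, physical, right bond).
mps = [rng.normal(size=(1 if s == 0 else D, d, 1 if s == L - 1 else D))
       for s in range(L)]

def born_sample(mps, rng):
    """Draw one configuration site by site using exact conditionals."""
    n = len(mps)
    d = mps[0].shape[1]
    # Right environments: right[s] sums A[s]...A[n-1] against themselves.
    right = [None] * (n + 1)
    right[n] = np.ones((1, 1))
    for s in reversed(range(n)):
        A = mps[s]
        right[s] = np.einsum('lxr,rq,mxq->lm', A, right[s + 1], A)
    left = np.ones((1, 1))        # environment of the outcomes chosen so far
    config = []
    for s in range(n):
        A = mps[s]
        # Unnormalised conditional probabilities of each local outcome x.
        probs = np.einsum('lm,lxr,mxq,rq->x', left, A, A, right[s + 1])
        probs = np.clip(probs, 0.0, None)
        probs /= probs.sum()
        x = int(rng.choice(d, p=probs))
        config.append(x)
        # Absorb the chosen outcome into the left environment.
        left = np.einsum('lm,lr,mq->rq', left, A[:, x, :], A[:, x, :])
    return config

print(born_sample(mps, rng))
```

With real data, the MPS tensors would first be trained (for example by maximising the log-likelihood of the training set) so that the Born distribution matches the data distribution; the sampling loop itself is unchanged.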
“…Tensor networks have been successfully applied to several learning tasks, including dimensionality reduction [250], unsupervised learning and generative modelling using matrix product states [251-253], representation learning with multi-scale tensor networks [254], sequence-to-sequence learning using matrix product operators [255], language modelling [256, 257], and Bayesian inference [258]. Ref.…”
Section: Quantum Physics-inspired Machine Learning
mentioning
confidence: 99%