2020
DOI: 10.48550/arXiv.2012.11841
Preprint

Residual Matrix Product State for Machine Learning

Cited by 4 publications (7 citation statements), published in 2022 and 2023.
References 20 publications.

“…which have been used in other implementations of tensor network regression [4, 15, 16]. When equation (7) is used to construct $X$, the regression function $\vec{f}(\vec{x}; W)$ from equation (6) becomes…”
Section: Regression With Tensors
confidence: 99%
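To make the quoted construction concrete: in this family of models the data tensor $X$ is the outer product of local feature vectors, and the regression output is the full contraction of $X$ with the weight tensor $W$. Below is a minimal NumPy sketch assuming the common local feature map $\phi(x_i) = (1, x_i)$; the function names are illustrative, not from the paper.

```python
import numpy as np

def local_features(x):
    # Local feature map phi(x_i) = (1, x_i), one vector per component.
    return np.stack([np.ones_like(x), x], axis=1)  # shape (n, 2)

def data_tensor(x):
    # Data tensor X = phi(x_1) (x) phi(x_2) (x) ... (x) phi(x_n).
    phis = local_features(x)
    X = phis[0]
    for phi in phis[1:]:
        X = np.tensordot(X, phi, axes=0)  # outer product
    return X  # n indices, each of dimension 2

def regress(x, W):
    # f(x; W): full contraction of the weight tensor with X.
    X = data_tensor(x)
    return np.tensordot(W, X, axes=X.ndim)

rng = np.random.default_rng(0)
x = rng.normal(size=4)          # n = 4 input features
W = rng.normal(size=(2,) * 4)   # dense weight tensor, 2^n entries
print(regress(x, W))
```

The dense $W$ has $2^n$ entries, which is exactly why such models factor it as a matrix product state in practice; the sketch keeps it dense only to stay short.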
“…This could be viewed as a form of data tensor preprocessing, where elements of $X$ corresponding to interaction degrees other than $j$ are removed. Alternatively, the tensor diagrams in equation (16) show that it is equally valid to consider $P^{(j)}$ acting on the weight tensor $W$. Under this interpretation, the set $\{\vec{d}^{(j)}(\vec{x}; W)\}_{j=0}^{m}$ consists of regressions on $X$ performed by different models, each derived from a common tensor $W$ by keeping only those elements corresponding to interactions of degree $j$. We will shift between these two interpretations freely throughout the remainder of this section, describing the interaction decomposition as a procedure which picks out different pieces of a tensor network model and which restricts the set of feature products that can be used for regression.…”
Section: Interaction Subspaces
confidence: 99%
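Continuing the dense sketch above, the projector $P^{(j)}$ can be emulated by masking every entry of $W$ whose index tuple does not select exactly $j$ non-constant local features. This is an illustrative reading of the quoted decomposition, again assuming $\phi(x_i) = (1, x_i)$; the helper names are hypothetical.

```python
import numpy as np
from functools import reduce
from itertools import product

def degree_projector(n, j):
    # P^(j): keep only entries of an n-index (2, ..., 2) tensor whose
    # index tuple contains exactly j ones, i.e. entries built from
    # exactly j non-constant local features.
    mask = np.zeros((2,) * n)
    for idx in product((0, 1), repeat=n):
        if sum(idx) == j:
            mask[idx] = 1.0
    return mask

def degree_component(W, X, j):
    # d^(j)(x; W): regression restricted to degree-j interactions,
    # read here as P^(j) acting on the weight tensor W.
    Wj = W * degree_projector(W.ndim, j)
    return np.tensordot(Wj, X, axes=X.ndim)

# Sanity check: the degree components sum back to the full regression.
rng = np.random.default_rng(0)
n = 4
phis = [np.array([1.0, xi]) for xi in rng.normal(size=n)]
X = reduce(lambda a, b: np.tensordot(a, b, axes=0), phis)
W = rng.normal(size=(2,) * n)
full = np.tensordot(W, X, axes=n)
parts = sum(degree_component(W, X, j) for j in range(n + 1))
assert np.isclose(full, parts)
```

Because the projectors sum to the identity over index tuples, the components $\vec{d}^{(j)}$ recover the full model when summed, which the final assertion verifies.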
“…We will now discuss the application of deep tensor networks to the MNIST and Fashion MNIST image classification tasks. Tensor networks have previously been applied to both tasks with moderate success [1,2,3,4,5,6,7,8,36]. The best results using tensor network methods in combination with neural networks are a 0.69% error rate on MNIST [36] (CNN+PEPS) and an 8.9% error rate on Fashion MNIST [36] (CNN+PEPS).…”
Section: Image Classification With Deep Tensor Network
confidence: 99%
“…Motivated by the successful approximation of quantum amplitudes, tensor networks have also been applied to machine learning problems (e.g. image classification [1,2,3,4,5,6,7,8], generative modelling [9,10,11,12], sequence and language modelling [13,14,15,16], anomaly detection [17,18]). Adopting tensor network methods in machine learning enables compression of neural networks [19,20,21], adaptive training algorithms [22], derivation of interesting generalisation bounds [15], information-theoretic insight [23,24,25,26], and new connections between machine learning and physics [27,28,29].…”
Section: Introduction
confidence: 99%
“…Thanks to the rapid progress of computer technology, data in tensor format (i.e., multi-dimensional arrays) are emerging in computer vision, machine learning, remote sensing, quantum physics, and many other fields, triggering an increasing need for tensor-based learning theory and algorithms [1–6]. In this paper, we carry out both theoretical and algorithmic studies of tensor recovery from linear observations, a typical problem in tensor learning that aims to learn an unknown tensor when only a limited number of its noisy linear observations are available [7].…”
Section: Introduction
confidence: 99%
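As a minimal illustration of the recovery problem the quote describes, suppose the observations are $y_i = \langle A_i, T \rangle + \varepsilon_i$ with Gaussian measurement tensors. When the number of observations exceeds the number of tensor entries, a plain least-squares fit already recovers $T$; the limited-observation regime is what motivates the low-rank and tensor-network priors studied in this line of work. The sketch below is a hypothetical setup, not the cited paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
shape = (4, 4, 4)                 # shape of the unknown tensor T
n = int(np.prod(shape))           # number of entries in T
m = 2 * n                         # number of linear observations

# Ground-truth tensor and noisy linear observations y = A vec(T) + eps.
T_true = rng.normal(size=shape)
A = rng.normal(size=(m, n))       # row i is vec(A_i)
y = A @ T_true.ravel() + 0.01 * rng.normal(size=m)

# Least-squares estimate of vec(T); with m < n one would add a
# low-rank or tensor-network constraint to make recovery possible.
T_hat = np.linalg.lstsq(A, y, rcond=None)[0].reshape(shape)
print('relative error:',
      np.linalg.norm(T_hat - T_true) / np.linalg.norm(T_true))
```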