2022
DOI: 10.4171/mag/101

Tensor networks in machine learning

Abstract: A tensor network is a type of decomposition used to express and approximate large arrays of data. A given dataset, quantum state, or higher-dimensional multilinear map is factored and approximated by a composition of smaller multilinear maps. This is reminiscent of how a Boolean function might be decomposed into a gate array: this represents a special case of tensor decomposition in which the tensor entries are restricted to 0 and 1 and the factorisation becomes exact. The associated techniques are called tensor n…
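
The decomposition described in the abstract can be made concrete with a small numerical sketch. The code below is an illustration, not taken from the article; the helper names tt_decompose and tt_contract and the rank cap are my own choices. It factors a fourth-order array into a chain of three-index cores by successive truncated SVDs (the tensor-train form of a tensor network) and then recontracts the cores to check the approximation.

import numpy as np

def tt_decompose(tensor, max_rank):
    # Factor an order-d array into d three-index cores (tensor-train form)
    # by sweeping left to right with truncated SVDs.
    dims = tensor.shape
    cores = []
    rank_left = 1
    mat = tensor.reshape(rank_left * dims[0], -1)
    for k in range(len(dims) - 1):
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        rank_right = min(max_rank, s.size)
        cores.append(u[:, :rank_right].reshape(rank_left, dims[k], rank_right))
        # carry the remaining factor (singular values times right vectors)
        # into the next step of the sweep
        mat = (np.diag(s[:rank_right]) @ vt[:rank_right]).reshape(
            rank_right * dims[k + 1], -1)
        rank_left = rank_right
    cores.append(mat.reshape(rank_left, dims[-1], 1))
    return cores

def tt_contract(cores):
    # Recompose the full array by contracting the chain of cores.
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([-1], [0]))
    return out.squeeze(axis=(0, -1))

# A 10 x 10 x 10 x 10 array that is a sum of two rank-1 terms: its 10**4
# entries are captured by four small cores of bond dimension at most 2.
rng = np.random.default_rng(0)
u1, u2, u3, u4 = rng.standard_normal((4, 10))
v1, v2, v3, v4 = rng.standard_normal((4, 10))
full = (np.einsum('i,j,k,l->ijkl', u1, u2, u3, u4)
        + np.einsum('i,j,k,l->ijkl', v1, v2, v3, v4))
cores = tt_decompose(full, max_rank=5)
approx = tt_contract(cores)
print(np.linalg.norm(full - approx) / np.linalg.norm(full))  # ~0, exact up to floating point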

Cited by 8 publications (5 citation statements). References 40 publications.
“…Tensor networks have been used to describe and predict quantum states [49]. They have also been used to study low-energy many-body quantum entanglement of local Hamiltonians [50], and to study ground-state dynamics and interactions in quantum chemistry [51], [52].…”
Section: Tensor Network (mentioning; confidence: 99%)
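
As an illustration of the first claim in this citation statement, the following sketch (my own, not taken from refs [49]–[52]; the helper names are hypothetical) writes the N-qubit GHZ state, whose 2**N amplitudes would be exponentially costly to store directly, as a matrix-product state built from N small cores of bond dimension 2.

import numpy as np

def ghz_mps(n):
    # MPS cores with index order (left bond, physical spin, right bond).
    first = np.zeros((1, 2, 2)); first[0, 0, 0] = first[0, 1, 1] = 1 / np.sqrt(2)
    mid = np.zeros((2, 2, 2)); mid[0, 0, 0] = mid[1, 1, 1] = 1.0
    last = np.zeros((2, 2, 1)); last[0, 0, 0] = last[1, 1, 0] = 1.0
    return [first] + [mid] * (n - 2) + [last]  # requires n >= 2

def mps_to_vector(cores):
    # Contract the chain of cores into the full 2**N amplitude vector.
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([-1], [0]))
    return out.reshape(-1)

n = 4
psi = mps_to_vector(ghz_mps(n))  # 16 amplitudes from 4 small cores
print(psi[0], psi[-1])           # 1/sqrt(2) on |0000> and |1111>, 0 elsewhere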
“…Yet, as Ref. [144] mentions, machine learning can aid, in turn, in determining a factorization of a TN approximating a data set. Moreover, TNs are also used to compress the layers of ANN architectures, besides a variety of other uses.…”
Section: TNs as a Generalization of the Main Model Architectures in ML (mentioning; confidence: 99%)
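
To make the layer-compression remark concrete, here is a minimal sketch (my own, not from the cited works) of the simplest tensor-network compression of a dense layer: a truncated SVD splits the weight matrix into two factors, cutting the parameter count. Tensor-train schemes reshape the weights into a higher-order tensor and use more cores, but the parameter-counting argument is the same.

import numpy as np

rng = np.random.default_rng(1)
n_out, n_in, rank = 512, 1024, 32

W = rng.standard_normal((n_out, n_in))      # original dense weights
u, s, vt = np.linalg.svd(W, full_matrices=False)
A = u[:, :rank] * s[:rank]                  # (n_out, rank)
B = vt[:rank, :]                            # (rank, n_in)

x = rng.standard_normal(n_in)
y_dense = W @ x                             # original layer
y_compressed = A @ (B @ x)                  # factored layer

params_dense = n_out * n_in                 # 524288
params_compressed = rank * (n_out + n_in)   # 49152
print(params_dense, params_compressed)
print(np.linalg.norm(y_dense - y_compressed) / np.linalg.norm(y_dense))

For a random matrix the truncation error is large, but trained network weights often have approximate low-rank structure, which is what makes this kind of factorised replacement useful in practice.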