2017
DOI: 10.48550/arxiv.1704.01552
Preprint
Deep Learning and Quantum Entanglement: Fundamental Connections with Implications to Network Design

Abstract: Deep convolutional networks have witnessed unprecedented success in various machine learning applications. Formal understanding of what makes these networks so successful is gradually unfolding, but for the most part there are still significant mysteries to unravel. The inductive bias, which reflects prior knowledge embedded in the network architecture, is one of them. In this work, we establish a fundamental connection between the fields of quantum physics and deep learning. We use this connection for asserti…

Cited by 37 publications (55 citation statements). References 29 publications.
“…Tensor networks [1][2][3][4][5][6][7][8][9][10][11][12][13][14][15][16] offer an efficient representation of otherwise exponentially complex objects, such as the many-body wavefunction of a quantum system. Over the last two decades, tensor networks have become important in an ever-growing number of research areas, including quantum information and condensed matter [1][2][3][4][5][6][7][8][9][10][11], quantum chemistry [17][18][19][20], statistical mechanics [21][22][23][24], quantum gravity [25][26][27][28][29], and machine learning [30][31][32][33][34]. Two particularly useful tensor networks are the matrix product state (MPS) [1][2][3]…”
Mentioning (confidence: 99%)
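The efficiency claim in the excerpt above can be made concrete with a minimal sketch: a matrix product state stores an N-site quantum state as a chain of small tensors whose parameter count grows linearly in N, while the full state vector grows exponentially. This is a generic illustration (all names and dimensions are chosen here for demonstration, not taken from the cited works):

```python
import numpy as np

# An open-boundary MPS for N sites with physical dimension d and bond
# dimension D: one small tensor per site instead of a d**N-长 vector.
N, d, D = 10, 2, 4  # sites, physical dim, bond dim (illustrative values)

rng = np.random.default_rng(0)
# First and last tensors have a trivial (size-1) outer bond.
tensors = [rng.standard_normal((1 if i == 0 else D,
                                d,
                                1 if i == N - 1 else D))
           for i in range(N)]

def contract_mps(tensors):
    """Contract an MPS into the full state vector.

    This costs exponential memory and exists only to show that the
    compact MPS and the full vector describe the same object; real
    tensor-network algorithms never form the full vector.
    """
    state = tensors[0]                       # shape (1, d, D)
    for t in tensors[1:]:
        # Sum over the shared bond index between neighboring tensors.
        state = np.tensordot(state, t, axes=([-1], [0]))
    return state.reshape(-1)                 # length d**N

psi = contract_mps(tensors)
mps_params = sum(t.size for t in tensors)
print(len(psi), mps_params)  # 1024 amplitudes vs. 272 MPS parameters
```

For fixed bond dimension D, the MPS needs O(N d D**2) parameters against d**N amplitudes for the dense vector, which is the compression the excerpt refers to.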
“…The q-CNN architecture discussed in this work is based on the theoretical architecture, named deep convolutional arithmetic circuit, introduced in [23,24]. The product pooling proposed in [23,24] presents a challenge in training the network described above, since it can easily lead to numerical instabilities such as underflow or overflow. We would like to render the network trainable in practice, and do so without spoiling the analogy to quantum many-body systems.…”
Section: Related Work (mentioning)
confidence: 99%
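The instability the excerpt describes is the generic one for product pooling: multiplying many activations drives magnitudes toward zero or infinity exponentially fast. A standard remedy, sketched below in log space (this is a hedged illustration, not the q-CNN implementation from the cited work):

```python
import numpy as np

def product_pool(x, window):
    """Naive product pooling over non-overlapping windows.

    Products of many small (or large) numbers underflow (or overflow)
    as depth grows, which is the instability noted above.
    """
    return np.prod(x.reshape(-1, window), axis=1)

def log_product_pool(log_x, window):
    """Stable equivalent on log-magnitudes: the product becomes a sum."""
    return np.sum(log_x.reshape(-1, window), axis=1)

x = np.full(8, 1e-3)                        # small positive activations
naive = product_pool(x, 4)                  # ~1e-12 per window already
stable = np.exp(log_product_pool(np.log(x), 4))
print(np.allclose(naive, stable))           # True
```

Keeping intermediate results as sums of logs (only exponentiating, if ever, at the end) preserves the multiplicative structure of product pooling while staying trainable in floating point; handling zero or negative activations would additionally require tracking signs, which this sketch omits.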
“…In [31], the entanglement entropy of the final trained network was computed for a very different architecture directly related to tensor networks, but not its evolution during training or the correlation between entanglement and accuracy. In [32] and [24], the possibility was raised that the requirement of being able to accommodate the entanglement could be used to guide the design of the network. We note that the values of the entanglement entropy in our experiments on the MNIST and F-MNIST datasets are of the order of log 2.…”
Section: Related Work (mentioning)
confidence: 99%
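The "order of log 2" scale quoted above is the entanglement entropy of a maximally entangled pair of two-level systems. For any bipartite pure state it can be computed from the Schmidt (singular) values of the reshaped state vector; a minimal sketch (generic textbook computation, not code from the cited papers):

```python
import numpy as np

def entanglement_entropy(psi, dim_a, dim_b):
    """Von Neumann entanglement entropy (natural log) of a bipartite
    pure state, via the singular values of its dim_a x dim_b reshape."""
    s = np.linalg.svd(psi.reshape(dim_a, dim_b), compute_uv=False)
    p = s ** 2                       # Schmidt coefficients sum to 1
    p = p[p > 1e-12]                 # drop zero modes before taking logs
    return -np.sum(p * np.log(p))

bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)
product = np.array([1.0, 0.0, 0.0, 0.0])             # |00>, unentangled
print(entanglement_entropy(bell, 2, 2))     # log 2 ≈ 0.693
print(entanglement_entropy(product, 2, 2))  # 0.0
```

A maximally entangled two-qubit state has two equal Schmidt coefficients of 1/2, giving entropy log 2, while a product state has a single unit coefficient and zero entropy; entropies "of the order of log 2" therefore indicate roughly one bit of entanglement across the cut.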