2018
DOI: 10.1103/PhysRevE.97.053304

Scale-invariant feature extraction of neural network and renormalization group flow

Abstract: Theoretical understanding of how a deep neural network (DNN) extracts features from input images is still unclear, but it is widely believed that the extraction is performed hierarchically through a process of coarse graining. It reminds us of the basic renormalization group (RG) concept in statistical physics. In order to explore possible relations between DNN and RG, we use the restricted Boltzmann machine (RBM) applied to an Ising model and construct a flow of model parameters (in particular, temperature) g…
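As a rough illustration of the setup sketched in the abstract (a minimal sketch, not the authors' code), the snippet below defines a binary restricted Boltzmann machine trained with one-step contrastive divergence; iterating its reconstruction step v → h → v′ on Ising configurations is the kind of operation used to build the flow discussed in the paper. All sizes, the learning rate, and the class/method names are illustrative assumptions.

```python
# Minimal binary RBM with CD-1 training; illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    def __init__(self, n_visible, n_hidden, lr=0.01):
        self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
        self.a = np.zeros(n_visible)   # visible biases
        self.b = np.zeros(n_hidden)    # hidden biases
        self.lr = lr

    def sample_h(self, v):
        p = sigmoid(v @ self.W + self.b)
        return p, (rng.random(p.shape) < p).astype(float)

    def sample_v(self, h):
        p = sigmoid(h @ self.W.T + self.a)
        return p, (rng.random(p.shape) < p).astype(float)

    def cd1_update(self, v0):
        # One contrastive-divergence step on a minibatch v0 with entries in {0, 1}.
        ph0, h0 = self.sample_h(v0)
        pv1, _ = self.sample_v(h0)
        ph1, _ = self.sample_h(pv1)
        n = v0.shape[0]
        self.W += self.lr * (v0.T @ ph0 - pv1.T @ ph1) / n
        self.a += self.lr * (v0 - pv1).mean(axis=0)
        self.b += self.lr * (ph0 - ph1).mean(axis=0)

    def reconstruct(self, v):
        # v -> h -> v': one iteration of the reconstruction used to generate a flow.
        _, h = self.sample_h(v)
        pv, _ = self.sample_v(h)
        return pv
```

Ising spins s ∈ {−1, +1} would be mapped to v = (s + 1)/2 before training, and back afterwards.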

Cited by 73 publications (109 citation statements). References 41 publications.

“…When a probability distribution p is given, we can obtain a concrete example of the spin configurations by replacing the expectation value ⟨v_i⟩_p at each site with ±1 with probability (1 ± ⟨v_i⟩_p)/2. Then the flow of probability distributions can be regarded as a flow of spin configurations, and it can be thought of as an "RBM flow" of spin configurations as in [34]. The RBM learning and the process of reconstruction under the learned distributions to produce the RBM flow are schematically presented in Fig.…”
Section: B. RBM Flow of Configurations (mentioning)
confidence: 99%
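A minimal sketch of the sampling rule quoted above, assuming the per-site expectation values ⟨v_i⟩_p lie in [−1, 1] (function and variable names are illustrative, not from the cited paper):

```python
# Draw spin configurations s_i in {-1, +1} with P(s_i = +1) = (1 + <v_i>_p) / 2.
import numpy as np

def sample_spins(expectations, n_samples=1, rng=None):
    rng = rng or np.random.default_rng()
    expectations = np.asarray(expectations, dtype=float)
    p_up = (1.0 + expectations) / 2.0            # probability of s_i = +1
    u = rng.random((n_samples,) + expectations.shape)
    return np.where(u < p_up, 1, -1)

# A site with <v_i>_p = 0 comes out +1 or -1 with equal probability,
# while <v_i>_p = 1 is always +1.
spins = sample_spins([0.0, 1.0, -0.6], n_samples=4)
```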
“…Furthermore, in [34] it has been shown that the critical fixed point of the RG flow and the fixed point of data reconstruction by the ML techniques are coincident. The restricted Boltzmann machine (RBM) [35] plays a fundamental role in this task.…”
Section: Introduction (mentioning)
confidence: 99%
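For reference, the standard energy-based definition of a binary RBM (the common textbook convention, not necessarily the exact notation of Ref. [35]) is:

```latex
p(v, h) = \frac{1}{Z}\, e^{-E(v, h)}, \qquad
E(v, h) = -\sum_i a_i v_i \;-\; \sum_j b_j h_j \;-\; \sum_{i,j} v_i W_{ij} h_j ,
```

where Z is the partition function; marginalizing over the hidden units h gives the model distribution p(v) that is fitted to the spin data.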
“…The RBM is trained for the 1- and 2-dimensional Ising models with volumes L = 6, L² = 8 × 8 and L² = 16 × 16, at various values of temperature. The Ising model and its properties are a paradigm in the statistical physics literature, making it a suitable system on which to examine the training procedure and limitations of the RBMs [9,14,16,20]. Moreover, it is possible to generate a large number of Ising configurations using simple Monte Carlo simulations, avoiding the problem of small training sets.…”
Section: Introduction (mentioning)
confidence: 99%
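The quoted passage relies on generating Ising configurations with simple Monte Carlo; a minimal single-spin-flip Metropolis sketch for the 2D model (assuming coupling J = 1 and periodic boundaries; not the cited paper's actual code) would be:

```python
# Metropolis sampling of 2D Ising configurations at temperature T (k_B = 1, J = 1).
import numpy as np

def metropolis_ising_2d(L=16, T=2.27, n_sweeps=1000, rng=None):
    rng = rng or np.random.default_rng()
    s = rng.choice([-1, 1], size=(L, L))
    beta = 1.0 / T
    for _ in range(n_sweeps):
        for _ in range(L * L):                   # one sweep = L*L attempted flips
            i, j = rng.integers(0, L, size=2)
            nn = (s[(i + 1) % L, j] + s[(i - 1) % L, j]
                  + s[i, (j + 1) % L] + s[i, (j - 1) % L])
            dE = 2.0 * s[i, j] * nn              # energy cost of flipping s[i, j]
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                s[i, j] = -s[i, j]
    return s

# Example: one configuration of a 16 x 16 lattice slightly above T_c ≈ 2.269.
config = metropolis_ising_2d(L=16, T=2.5, n_sweeps=200)
```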
“…(variational, exact and tractable likelihood, principled structure design via information theory) and high computational efficiency. The NeuralRG approach is closer in spirit to the original proposal based on Bayesian net [20] than more recent discussions on Boltzmann Machines [21,23] and Principal Component Analysis [22]. Figure 1(a) shows the proposed architecture.…”
(mentioning)
confidence: 97%