Frequency Principle: Fourier Analysis Sheds Light on Deep Neural Networks
2020
DOI: 10.4208/cicp.oa-2020-0085

Cited by 217 publications (134 citation statements)
References 19 publications
“…These works have signified the importance of the F-Principle. Theoretically, Xu et al. [32] propose a theorem characterizing the initial training stage of a two-layer tanh network, which has also been adopted in the analysis of DNNs with the ReLU activation function [23]. Another series of works [4,6,24,34,36] attempts to understand the F-Principle in very wide neural networks, which can be well approximated by a first-order expansion with respect to the network parameters (the linear neural tangent kernel (NTK) regime).…”
Section: Introduction (mentioning)
confidence: 99%
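For context on the "linear NTK regime" mentioned in this excerpt: in sufficiently wide networks, training barely moves the parameters, so the model is well approximated by its first-order Taylor expansion in those parameters. A minimal sketch in generic notation (the symbol f(x; θ) and its initialization θ_0 are our notation, not taken from the paper):

    % Linearization of the network output around initialization \theta_0
    f(x;\theta) \approx f(x;\theta_0) + \nabla_\theta f(x;\theta_0)^{\top}(\theta - \theta_0)
    % Gradient descent on this linear model amounts to kernel regression
    % with the fixed neural tangent kernel
    K(x,x') = \nabla_\theta f(x;\theta_0)^{\top}\,\nabla_\theta f(x';\theta_0)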
“…In this work, we take another approach that uses Fourier analysis to study the learning behavior of DNNs, based on the phenomenon of the Frequency Principle (F-Principle): a DNN tends to learn a target function from low to high frequencies during training [23,31,32,36]. Empirically, the F-Principle is widely observed in general DNNs on both benchmark and synthetic data [31,32]. Conceptually, it provides a qualitative explanation for the success and failure of DNNs [32].…”
Section: Introduction (mentioning)
confidence: 99%
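To make the low-to-high-frequency claim concrete, here is a minimal, self-contained sketch (our illustration, not code from the cited works): a small tanh network is fit to a two-frequency target, and the magnitude of each Fourier mode of the fitting residual is printed during training. Under these assumptions one typically sees the low-frequency (k=1) residual decay well before the high-frequency (k=5) one.

    import numpy as np
    import torch
    import torch.nn as nn

    torch.manual_seed(0)

    # Two-frequency target sampled on [0, 2*pi): modes k=1 (low) and k=5 (high).
    n = 256
    x = (torch.arange(n, dtype=torch.float32) * (2 * np.pi / n)).unsqueeze(1)
    y = torch.sin(x) + torch.sin(5 * x)

    # Small two-layer tanh network, echoing the setting analyzed by Xu et al. [32].
    net = nn.Sequential(nn.Linear(1, 200), nn.Tanh(), nn.Linear(200, 1))
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)

    def mode_error(residual, k):
        # Magnitude of the k-th Fourier mode of the fitting residual.
        spec = np.fft.rfft(residual.detach().numpy().ravel())
        return float(abs(spec[k]))

    for step in range(5001):
        opt.zero_grad()
        loss = ((net(x) - y) ** 2).mean()
        loss.backward()
        opt.step()
        if step % 1000 == 0:
            r = net(x) - y
            print(f"step {step:5d}  |residual @ k=1| = {mode_error(r, 1):7.2f}"
                  f"  |residual @ k=5| = {mode_error(r, 5):7.2f}")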
“…Moreover, steganalysis pays more attention to the high-frequency texture information of an image, which is where the embedding probability of secret information is higher in adaptive steganography algorithms. According to the frequency principle [20], the deeper the network, the more likely it is to learn low-frequency image content, which is inconsistent with the goal of steganalysis. Therefore, an overly deep network limits the improvement of steganographic detection performance.…”
Section: Introduction (mentioning)
confidence: 99%