2022
DOI: 10.1109/tpami.2020.3032422

Depth Selection for Deep ReLU Nets in Feature Extraction and Generalization

Cited by 27 publications (31 citation statements)
References 44 publications
“…But their results are slightly worse than Theorem 2.1 in this paper, in the sense that they either contain an additional logarithmic term or hold only under a weaker norm. Recently, Han et al [17] indicated that deep ReLU nets can achieve the optimal generalization performance for numerous learning tasks, but the depth of [31] is much larger than ours. Zhou [35,36] also verified that the deep convolutional neural network (DCNN) is universal, i.e., a DCNN can approximate any continuous function to arbitrary accuracy when the depth of the network is large enough.…”
Section: Related Work
confidence: 62%
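The universality claim quoted above can be made concrete with a classical depth-based construction: Yarotsky's approximation of x^2 on [0,1] by repeatedly composing a ReLU-expressible "hat" function. The sketch below is not from the cited papers; it is a minimal numpy illustration of why approximation accuracy improves as depth grows.

```python
import numpy as np

def hat(x):
    # Triangle ("hat") function on [0, 1], expressible by one ReLU layer:
    # hat(x) = 2*relu(x) - 4*relu(x - 0.5)
    return 2 * np.maximum(x, 0) - 4 * np.maximum(x - 0.5, 0)

def relu_square(x, depth):
    # Yarotsky-style construction: x^2 = x - sum_{s>=1} hat^{(s)}(x) / 4^s,
    # truncated after `depth` compositions (one extra layer per term).
    approx, g = x.copy(), x.copy()
    for s in range(1, depth + 1):
        g = hat(g)              # s-fold composition of the hat function
        approx -= g / 4**s
    return approx

x = np.linspace(0, 1, 1001)
for depth in (2, 4, 8):
    err = np.max(np.abs(relu_square(x, depth) - x**2))
    print(f"depth={depth}: max error = {err:.2e}")  # decays like 4**(-depth)
```

Each extra composition costs only a constant number of ReLU units, so accuracy ε is reached with depth O(log(1/ε)), a small concrete instance of the "depth large enough" requirement in the quoted statement.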
“…All the above literature [34,31,17,35,36] demonstrated that deep nets with the ReLU activation function and DCNNs have good properties in both approximation and generalization. However, they are too deep to be practically usable in real tasks.…”
Section: Related Work
confidence: 99%
“…While it is known that a relation does exist between the network depth and the size of the convolutional filters (and consequently the receptive field) [26], the question of the necessity of depth has not been investigated much. In [27], the authors proposed a quantification of the correspondence between the features learnt by a network and its depth. DnCNN [12] was designed following this approach.…”
Section: Introduction
confidence: 99%
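The depth/receptive-field relation mentioned in the quote follows a standard recurrence: each layer enlarges the receptive field by (kernel_size - 1) times the product of all earlier strides. The sketch below is not taken from [26] or [27]; it simply applies that recurrence to a DnCNN-like stack of 3x3, stride-1 convolutions.

```python
def receptive_field(layers):
    """layers: list of (kernel_size, stride) pairs, input-to-output order."""
    rf, jump = 1, 1
    for k, s in layers:
        rf += (k - 1) * jump  # each layer widens the field by (k-1)*jump
        jump *= s             # jump = product of strides seen so far
    return rf

# A DnCNN-like stack: 17 conv layers with 3x3 kernels and stride 1.
print(receptive_field([(3, 1)] * 17))  # -> 35, i.e., a 35x35 receptive field
```

This is consistent with the commonly reported DnCNN configuration, where a depth of 17 yields a 35×35 receptive field, making the coupling between depth and receptive field explicit.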