2021
DOI: 10.48550/arxiv.2109.00190
Preprint
Approximation Properties of Deep ReLU CNNs

Juncai He, Lin Li, Jinchao Xu

Abstract: This paper is devoted to establishing L² approximation properties for deep ReLU convolutional neural networks (CNNs) on two-dimensional space. The analysis is based on a decomposition theorem for convolutional kernels with large spatial size and multiple channels. Given this decomposition and the properties of the ReLU activation function, a universal approximation theorem for deep ReLU CNNs with the classic structure is obtained by showing their connection with one-hidden-layer ReLU deep neural networks (DNNs). Furt…
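The decomposition idea in the abstract can be illustrated in the linear (pre-activation), single-channel case: a large convolutional kernel that is the full convolution of two small kernels acts on an image exactly like the two small-kernel layers applied in sequence. This is only a numpy sketch of that linear identity, not the paper's theorem (which handles multi-channel kernels and the ReLU activation); the helper names `conv2d_valid` and `conv2d_full` are illustrative.

```python
import numpy as np

def conv2d_valid(x, k):
    """Plain 2-D convolution (kernel flipped), 'valid' padding."""
    kh, kw = k.shape
    kf = k[::-1, ::-1]  # flip for true convolution (not cross-correlation)
    H, W = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.empty((H, W))
    for i in range(H):
        for j in range(W):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * kf)
    return out

def conv2d_full(a, b):
    """Full 2-D convolution: zero-pad a, then take the valid convolution."""
    ph, pw = b.shape[0] - 1, b.shape[1] - 1
    return conv2d_valid(np.pad(a, ((ph, ph), (pw, pw))), b)

rng = np.random.default_rng(0)
k1 = rng.standard_normal((3, 3))
k2 = rng.standard_normal((3, 3))
x = rng.standard_normal((8, 8))

# A 5x5 kernel built as the full convolution of two 3x3 kernels ...
big = conv2d_full(k1, k2)                            # shape (5, 5)

# ... acts on x exactly like the two 3x3 layers applied in sequence.
two_layers = conv2d_valid(conv2d_valid(x, k1), k2)   # shape (4, 4)
one_layer = conv2d_valid(x, big)                     # shape (4, 4)
print(np.allclose(two_layers, one_layer))            # True
```

Once a nonlinearity sits between the layers this identity no longer holds verbatim, which is why the paper needs a dedicated decomposition theorem and the one-sided structure of ReLU.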



Cited by 4 publications (5 citation statements); references 30 publications.
“…An alternative is the convolutional autoencoder (CAE). Convolutional networks not only have characteristic properties such as local connectivity and translation invariance [17,28], but also enjoy good approximation properties [3,20,38,48]. Refs.…”
Section: Convolutional Autoencoder Based Nonlinear ROM
confidence: 99%
“…Gühring, Kutyniok, and Petersen (2020); Gühring and Raslan (2021); Siegel and Xu (2022), and adapted to the case of CNNs exploiting some algebraic arguments that link dense and convolutional layers, see e.g. Zhou (2020); He, Li, and Xu (2021).…”
Section: Literature Review
confidence: 99%
“…Petersen and Voigtlaender (2020). Moreover, CNN models have mostly been studied for handling high-dimensional data as inputs rather than as outputs, as in He et al (2021). As a consequence, the available literature is left with a missing piece: understanding the approximation properties of convolutional layers when reconstructing functional signals.…”
Section: Literature Review
confidence: 99%
“…An approximation theory for the 1-D DCNN with a single channel, induced by convolutional matrices like (4), has been extensively studied in a sequence of papers by Zhou et al. [26,27,5,17]. Afterwards, Xu et al. [9] extended this to the approximation of L² functions by 2-D CNNs with odd filter sizes and stride one. It requires the intermediate layers to keep the same dimension as the input through the convolutions, which is only partially compatible with practice.…”
Section: Let
confidence: 99%
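The "convolutional matrices" this excerpt refers to (its equation (4) is not reproduced on this page) are Toeplitz matrices: a single-channel 1-D convolutional layer is a linear map whose matrix has the filter weights repeated along the diagonals. A minimal sketch, assuming the standard full-convolution convention; `conv_matrix` is an illustrative name, not from the cited papers.

```python
import numpy as np

def conv_matrix(k, n):
    """Toeplitz matrix T of shape (n+m-1, n) with T @ x == np.convolve(x, k)."""
    m = len(k)
    T = np.zeros((n + m - 1, n))
    for i in range(n + m - 1):
        for j in range(n):
            if 0 <= i - j < m:       # filter tap k[i-j] lands on this diagonal
                T[i, j] = k[i - j]
    return T

k = np.array([1.0, -2.0, 1.0])       # e.g. a discrete second-difference filter
x = np.arange(5, dtype=float)
T = conv_matrix(k, len(x))
print(np.allclose(T @ x, np.convolve(x, k)))  # True
```

Viewing each layer as such a structured matrix is what lets these papers transfer dense-network approximation results to CNNs: the convolutional weight matrix is a constrained (weight-shared, sparse) special case of a fully connected one.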