2015
DOI: 10.48550/arxiv.1512.06293
Preprint

A Mathematical Theory of Deep Convolutional Neural Networks for Feature Extraction

Cited by 15 publications (42 citation statements)
References 0 publications
Citing publications: 2016–2024

“…First steps towards addressing this question and developing a mathematical theory of DCNNs for feature extraction were made, for the continuous-time case, in (Mallat, 2012; Wiatowski & Bölcskei, 2015). Specifically, (Mallat, 2012) analyzed so-called scattering networks, where signals are propagated through layers that employ directional wavelet filters and modulus non-linearities but no intra-layer pooling.…”
Section: Introduction
confidence: 99%
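The quoted passage describes the layer operation of a scattering network: convolve the signal with a bank of wavelet filters, apply a modulus non-linearity, and iterate, with no intra-layer pooling. The sketch below illustrates that propagation rule for 1-D signals, using hypothetical Gaussian band-pass filters in place of Mallat's directional wavelets; `make_filters`, `scatter`, and all parameter choices are illustrative assumptions, not constructions from the cited papers.

```python
# Minimal sketch of scattering-style propagation: filter + modulus per
# layer, no intra-layer pooling. Filters here are hypothetical Gaussian
# band-passes, not Mallat's directional wavelets.
import numpy as np

def make_filters(n, num_scales=3):
    """Hypothetical band-pass filters, defined in the frequency domain."""
    freqs = np.fft.fftfreq(n)
    centers = [0.25 * 2.0 ** (-j) for j in range(num_scales)]  # dyadic centers
    return [np.exp(-((np.abs(freqs) - c) ** 2) / (2 * (c / 4) ** 2))
            for c in centers]

def scatter(x, filters, depth=2):
    """Propagate x through `depth` layers of (filter, modulus), collecting
    a crude low-pass average of every node as the feature vector."""
    nodes = [x]
    features = []
    for _ in range(depth):
        next_nodes = []
        for u in nodes:
            U = np.fft.fft(u)
            for h in filters:
                # convolution with the filter, then the modulus non-linearity
                next_nodes.append(np.abs(np.fft.ifft(U * h)))
        features.extend(v.mean() for v in next_nodes)  # stand-in low-pass output
        nodes = next_nodes
    return np.array(features)

x = np.random.randn(256)
print(scatter(x, make_filters(256)).shape)  # 3 + 9 = 12 feature values
```

With 3 filters and depth 2, the tree has 3 first-layer and 9 second-layer nodes, so the feature vector concatenates 12 averaged outputs; the tree grows geometrically with depth, which is why practical scattering implementations prune low-energy paths.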
“…The resulting wavelet-modulus feature extractor is horizontally (i.e., in every network layer) translation-invariant (accomplished by letting the wavelet scale parameter go to infinity) and deformation-stable, both properties of significance in practical feature extraction applications. Recently, (Wiatowski & Bölcskei, 2015) considered Mallat-type networks with arbitrary filters (that may be learned or pre-specified), general Lipschitz-continuous non-linearities (e.g., rectified linear unit, shifted logistic sigmoid, hyperbolic tangent, and the modulus function), and a continuous-time pooling operator that amounts to a dilation. The essence of the results in (Wiatowski & Bölcskei, 2015) is that vertical (i.e., asymptotically in the network depth) translation invariance and Lipschitz continuity of the feature extractor are induced by the network structure per se rather than the specific choice of filters and non-linearities.…”
Section: Introduction
confidence: 99%
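To make the generalized architecture concrete: each layer applies an arbitrary filter, a Lipschitz-continuous non-linearity, and a pooling step that in continuous time is a dilation. The sketch below assumes discrete-time signals, where the dilation is approximated by subsampling; `layer`, `pool_factor`, and the random filter are hypothetical illustrations, not the paper's construction.

```python
# Minimal sketch of one generalized layer in the sense of the quoted
# passage: arbitrary (learned or pre-specified) filter, Lipschitz
# non-linearity, and pooling realized as a dilation (approximated here
# in discrete time by subsampling).
import numpy as np

def layer(x, g, rho=np.abs, pool_factor=2):
    """One layer: filter with g, apply rho, then keep every
    pool_factor-th sample (discrete stand-in for f(t) -> f(S*t))."""
    y = np.convolve(x, g, mode="same")  # arbitrary filter (convolution)
    y = rho(y)                          # Lipschitz non-linearity
    return y[::pool_factor]             # pooling via subsampling/dilation

x = np.random.randn(128)
g = np.random.randn(9) / 3.0            # a random pre-specified filter
relu = lambda t: np.maximum(t, 0.0)     # ReLU is 1-Lipschitz
out = layer(layer(x, g, rho=relu), g, rho=np.abs)
print(out.shape)  # (32,) after two layers with pool_factor=2
```

The point of the quoted result is that the guarantees (vertical translation invariance, Lipschitz continuity of the extractor) hold for any such choice of `g` and 1-Lipschitz `rho`, since they follow from the layered filter-nonlinearity-pooling structure itself.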