“…Recently, (Wiatowski & Bölcskei, 2015) considered Mallat-type networks with arbitrary filters (that may be learned or pre-specified), general Lipschitz-continuous non-linearities (e.g., the rectified linear unit, shifted logistic sigmoid, hyperbolic tangent, and the modulus function), and a continuous-time pooling operator that amounts to a dilation. The essence of the results in (Wiatowski & Bölcskei, 2015) is that vertical (i.e., asymptotic in the network depth) translation invariance and Lipschitz continuity of the feature extractor are induced by the network structure per se, rather than by the specific choice of filters and non-linearities. For band-limited signals (Wiatowski & Bölcskei, 2015), cartoon functions (Grohs et al., 2016), and Lipschitz-continuous functions (Grohs et al., 2016), Lipschitz continuity of the feature extractor automatically leads to bounds on deformation sensitivity.…”
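To make the structural point concrete, the following is a minimal numerical sketch (not code from the cited papers) of one layer of such a network: a convolution with an arbitrary filter, a 1-Lipschitz non-linearity (ReLU), and pooling by sub-sampling with an energy-normalizing factor. The filter `h`, signal length, and sub-sampling factor `S` are illustrative choices; the check below confirms empirically that the layer's Lipschitz constant is bounded by a quantity depending only on the filter's frequency response and the pooling factor, independent of which Lipschitz non-linearity is plugged in.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(x, h, S=2):
    """One hypothetical network layer: conv -> Lipschitz non-linearity -> pooling."""
    n = len(x)
    # circular convolution with an arbitrary (learned or pre-specified) filter h
    y = np.fft.irfft(np.fft.rfft(x) * np.fft.rfft(h, n=n), n=n)
    # any 1-Lipschitz non-linearity works; ReLU chosen here for illustration
    y = np.maximum(y, 0.0)
    # pooling by sub-sampling with factor S, scaled by sqrt(S)
    return np.sqrt(S) * y[::S]

n, S = 64, 2
h = rng.standard_normal(n) / n          # illustrative filter
x = rng.standard_normal(n)
xp = rng.standard_normal(n)

# empirical Lipschitz ratio of the layer on this pair of signals
ratio = np.linalg.norm(layer(x, h, S) - layer(xp, h, S)) / np.linalg.norm(x - xp)

# bound induced by the structure: sqrt(S) * max_k |h_hat(k)|
bound = np.sqrt(S) * np.max(np.abs(np.fft.rfft(h, n=n)))
print(ratio <= bound + 1e-12)
```

The bound holds because each stage is individually Lipschitz: the convolution with constant `max |ĥ|`, the non-linearity with constant 1, and the normalized sub-sampling with constant `sqrt(S)`; the layer's constant is at most their product, regardless of the particular filter or non-linearity, which mirrors the "induced by the network structure per se" message of the quoted passage.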