2019
DOI: 10.48550/arxiv.1907.12998
Preprint
Approximation Capabilities of Neural ODEs and Invertible Residual Networks

Abstract: Neural Ordinary Differential Equations have recently been proposed as an infinite-depth generalization of residual networks. Neural ODEs provide out-of-the-box invertibility of the mapping realized by the neural network, and can lead to networks that are more efficient in terms of computational time and parameter space. Here, we show that a Neural ODE operating on a space with dimensionality increased by one compared to the input dimension is a universal approximator for the space of continuous functions, at th…
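The augmentation device described in the abstract can be sketched numerically: append one extra zero coordinate to each input and flow the augmented state through a learned vector field. The Euler integrator, the random MLP field, and all sizes below are illustrative assumptions, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_field(x, W1, b1, W2, b2):
    """Vector field f(x): a one-hidden-layer MLP with tanh activation."""
    return np.tanh(x @ W1 + b1) @ W2 + b2

def odeint_euler(x0, params, t1=1.0, steps=100):
    """Integrate dx/dt = f(x) from t=0 to t=t1 with forward Euler."""
    x, dt = x0, t1 / steps
    for _ in range(steps):
        x = x + dt * mlp_field(x, *params)
    return x

d = 2            # input dimension
d_aug = d + 1    # augmented dimension: the paper's key device is d + 1
h = 16           # hidden width (arbitrary choice)
params = (rng.normal(size=(d_aug, h)) * 0.5, np.zeros(h),
          rng.normal(size=(h, d_aug)) * 0.5, np.zeros(d_aug))

x = rng.normal(size=(5, d))                            # batch of 2-D inputs
x_aug = np.concatenate([x, np.zeros((5, 1))], axis=1)  # append a zero channel
y = odeint_euler(x_aug, params)                        # flow the augmented state
print(y.shape)   # outputs live in the (d + 1)-dimensional augmented space
```

A trained model would learn the MLP weights so that a readout of the flowed state approximates the target function; here random weights only demonstrate the shapes involved.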

Cited by 15 publications (27 citation statements) | References 7 publications
“…On the expressive power of residual flows, all existing theoretical analyses present negative results for these models (Zhang et al., 2019; Koehler et al., 2020; Kong and Chaudhuri, 2020). These results indicate residual flows are either unable to express certain functions, or unable to approximate certain distributions even with large depths.…”
Section: Related Work (mentioning, confidence: 98%)
“…In the literature of normalizing flows, there are universal approximation results for several models, including autoregressive flows (Germain et al., 2015; Kingma et al., 2016; Papamakarios et al., 2017; Huang et al., 2018; Jaini et al., 2019), coupling flows (Teshima et al., 2020; Koehler et al., 2020), and augmented normalizing flows (Zhang et al., 2019; Huang et al., 2020). There is also a continuous-time generalization of normalizing flows called neural ODEs (Chen et al., 2018; Dupont et al., 2019) with a universal approximation result (Zhang et al., 2019). We do not consider these flows in this paper.…”
Section: Related Work (mentioning, confidence: 99%)
“…In this sense, an alternative architecture which ensures the existence and continuity of φ⁻¹ would be appealing. ODE-nets enjoy this property and have been proven to be universal approximators for homeomorphisms [63]. However, the development and implementation of ODE-nets is still in its infancy, so we did not investigate this further.…”
Section: Approximation of the Reduced Map (mentioning, confidence: 99%)
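The out-of-the-box invertibility this excerpt refers to can be illustrated by integrating the same vector field backward in time. The forward-Euler scheme and random weights below are stand-in assumptions, so the recovered input matches only up to discretization error, which shrinks as the step count grows.

```python
import numpy as np

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(2, 8)) * 0.3, np.zeros(8)
W2, b2 = rng.normal(size=(8, 2)) * 0.3, np.zeros(2)

def f(x):
    """Autonomous vector field: a small random MLP (illustrative only)."""
    return np.tanh(x @ W1 + b1) @ W2 + b2

def flow(x, t1, steps=2000):
    """Forward Euler from t=0 to t=t1; the sign of t1 selects the direction."""
    dt = t1 / steps
    for _ in range(steps):
        x = x + dt * f(x)
    return x

x0 = rng.normal(size=(4, 2))
y = flow(x0, 1.0)       # forward map phi realized by the ODE-net
x_rec = flow(y, -1.0)   # backward integration approximates phi^{-1}
print(np.max(np.abs(x_rec - x0)))  # small reconstruction error
```

With an exact ODE solver the inverse would be exact; the residual here is purely the integrator's truncation error.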
“…Subsequently, numerous ODE- and PDE-based network architectures [4,12,14,22,48,61,69,70] and continuous-time recurrent units [7,16,43,58-60] have been proposed. Exploiting the connections between ODE-Nets and the theory of dynamical systems and control is an active area of research [50,53,68], which has also motivated the development of more memory-efficient training strategies [13,19,20,54,72] for ODE-Nets. Other research fronts include normalizing flows [21,36,65] and stochastic differential equations [23,32,34,42,45].…”
Section: Related Work (mentioning, confidence: 99%)