2021
DOI: 10.48550/arxiv.2106.13898
Preprint

Closed-form Continuous-time Neural Models

Abstract: Continuous-depth neural models, where the derivative of the model's hidden state is defined by a neural network, have enabled strong sequential data processing capabilities. However, these models rely on advanced numerical differential equation (DE) solvers, resulting in significant overhead in both computational cost and model complexity. In this paper, we present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude…
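To make the contrast concrete: an ODE-based model must numerically integrate a learned vector field to reach a query time, while a CfC-style model evaluates an explicit expression at that time. Below is a minimal sketch of both updates, modeled loosely on the paper's gated closed-form solution; the function and parameter names (neural_ode_step, cfc_step, Wf, Wg, Wh) are illustrative, not the authors' implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# ODE-based continuous-depth model: the hidden state evolves by a
# learned vector field and must be integrated numerically.
def neural_ode_step(x, I, W, dt):
    # One explicit Euler step of dx/dt = f(x, I); real models use
    # adaptive solvers, which is the overhead the abstract refers to.
    f = np.tanh(W @ np.concatenate([x, I]))
    return x + dt * f

# Illustrative CfC-style update (simplified from the paper's closed
# form): the state at time t is an explicit gated interpolation,
# so no iterative solver is needed.
def cfc_step(x, I, Wf, Wg, Wh, t):
    z = np.concatenate([x, I])
    f = np.tanh(Wf @ z)      # controls a state-dependent time constant
    g = np.tanh(Wg @ z)      # one target state
    h = np.tanh(Wh @ z)      # another target state
    gate = sigmoid(-f * t)   # closed-form time dependence
    return gate * g + (1.0 - gate) * h
```

Iterating neural_ode_step to time t costs many function evaluations (more with adaptive solvers), whereas cfc_step reaches time t in a single evaluation, which is where the claimed order-of-magnitude speedup comes from.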

Cited by 6 publications (8 citation statements)
References 31 publications

Citation statements:
“…Although GoTube is considerably more computationally efficient than existing methods, the dimensionality of the system-under-test as well as the type of numerical ODE solver exponentially affect their performance. We can improve on this limitation by using Hypersolvers (Poli et al., 2020), closed-form continuous-depth models (Hasani et al., 2021a), and compressed representations of neural ODEs (Liebenwein et al., 2021).…”
Section: Discussion, Scope and Conclusion (mentioning)
confidence: 99%
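The Hypersolvers mentioned in this statement (Poli et al., 2020) reduce solver overhead by pairing a cheap explicit step with a small network trained to approximate that step's local truncation error. A minimal sketch of the idea with an Euler base step follows; g_w stands for the learned correction network and is an illustrative name, not the cited paper's API.

```python
import numpy as np

def hypereuler_step(x, f, g_w, dt):
    # Base Euler update plus a learned second-order correction:
    # g_w is trained so that dt**2 * g_w(x) matches the local
    # truncation error of the plain Euler step.
    return x + dt * f(x) + dt**2 * g_w(x)
```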
“…Other approaches include directly predicting the resulting value at a variable time (Mattheakis, Joy, and Protopapas 2021; Chen et al. 2020a) and architectures with closed-form time propagation (Hasani et al. 2021). We take the approach of augmenting a traditional solver, but only focus on Hamiltonian systems.…”
Section: Related Work (mentioning)
confidence: 99%
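One reading of "directly predicting the resulting value at a variable time" is learning a flow map: a network that takes the initial state together with an arbitrary query time t and returns the state at t, with no integration loop at all. A minimal sketch under that reading (the two-layer network here is an illustrative assumption, not the cited papers' architecture):

```python
import numpy as np

def flow_map(x0, t, W1, W2):
    # Condition a small MLP on the query time: one forward pass
    # yields the predicted state x(t), replacing solver iteration.
    z = np.concatenate([x0, [t]])
    return W2 @ np.tanh(W1 @ z)
```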
“…This category includes the UnICORNN [47] and its predecessor coRNN [46], which discretize a second-order ODE inspired by oscillatory systems. Other models include the Liquid Time-Constant Networks (LTC) [27] and their successor, the CfC [26], which use underlying dynamical systems with varying time constants, stable behavior, and provable rates of expressivity measured by trajectory length. The LTC is based on earlier dynamic causal models (DCM) [21], which are a particular ODE related to state spaces with an extra bilinear term.…”
Section: Related Work (mentioning)
confidence: 99%
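The "varying time constants" in this statement refer to the LTC dynamics, where a learned nonlinearity makes each unit's decay rate depend on the current input. A minimal sketch of one LTC update, assuming the dynamics dx/dt = -(1/tau + f(x, I)) * x + f(x, I) * A from the LTC paper and the fused semi-implicit Euler step used there; parameter names are illustrative.

```python
import numpy as np

def ltc_step(x, I, W, A, tau, dt):
    # f makes both the decay rate and the attractor input-dependent,
    # which is what varies the effective time constant per unit.
    f = np.tanh(W @ np.concatenate([x, I]))
    # Fused (semi-implicit) Euler step of
    #   dx/dt = -(1/tau + f) * x + f * A
    return (x + dt * f * A) / (1.0 + dt * (1.0 / tau + f))
```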