2021
DOI: 10.3389/fncom.2021.678158

Gated Recurrent Units Viewed Through the Lens of Continuous Time Dynamical Systems

Abstract: Gated recurrent units (GRUs) are specialized memory elements for building recurrent neural networks. Despite their success on a variety of tasks, including extracting dynamics underlying neural data, little is understood about the specific dynamics representable in a GRU network. As a result, it is difficult to know a priori both how well a GRU network will perform on a given task and how well it can mimic the underlying behavior of its biological counterparts. Using a continuous time ana…


Cited by 26 publications (12 citation statements)
References 34 publications
“…[9] also recently proposed a continuous time recurrent network for more stable learning, without event-based mechanics. GRUs were formulated in continuous time in [33], but purely for analyzing their autonomous dynamics.…”
Section: Related Work
confidence: 99%
“…However, in general it is possible to express the GRU dynamics for an arbitrary time step Δt, with t_n = t_{n−1} + Δt. The discrete time GRU dynamics can be intuitively interpreted as an Euler discretization of an ordinary differential equation (ODE) [33] (see Supplement), which we extend further to formulate the EGRU. This is equivalent to taking the continuous time limit Δt → 0 to get dynamics for the internal state c(t) starting from the discrete time EGRU model outlined above.…”
Section: Limit to Continuous Time
confidence: 99%
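The Euler-discretization view quoted above can be checked directly: up to the choice of gate convention, the standard discrete GRU state update h' = z ⊙ h + (1 − z) ⊙ h̃ is exactly one forward-Euler step, with step size Δt = 1, of the ODE dh/dt = (1 − z) ⊙ (h̃ − h). A minimal numerical sketch, in which the gate values and states are random placeholders rather than a full GRU cell:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4  # hidden size (toy example)

# Random gate activations and candidate state, standing in for the
# usual sigmoid/tanh computations inside a GRU cell.
z = rng.uniform(0.1, 0.9, n)       # update gate values in (0, 1)
h_tilde = rng.standard_normal(n)   # candidate state
h = rng.standard_normal(n)         # current hidden state

# Discrete GRU update (one gate convention): h' = z*h + (1 - z)*h_tilde
h_discrete = z * h + (1 - z) * h_tilde

# The same update as one forward-Euler step of
#   dh/dt = (1 - z) * (h_tilde - h)
# with step size dt = 1:
dt = 1.0
h_euler = h + dt * (1 - z) * (h_tilde - h)

print(np.allclose(h_discrete, h_euler))  # True: the two updates coincide
```

Expanding the Euler step gives h + (1 − z)(h̃ − h) = z ⊙ h + (1 − z) ⊙ h̃, so the equivalence is algebraic; the continuous-time limit referenced in the snippet corresponds to shrinking dt below 1.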
“…Continuous trajectories have been effective descriptions in motor (Churchland et al, 2012), cognitive (Sohn et al, 2019), and sensory cortices (Chowdhury et al, 2020), leading to the concept of neural manifolds (Jazayeri and Afraz, 2017). However, continuous trajectories and dynamics do not preclude metastability, since continuous dynamical system features such as hyperbolic fixed points and multistable limit cycles can exist (Zhao and Park, 2016; Jordan et al, 2021). We further discuss methods that can analyze continuous trajectories in Section 5.1.…”
Section: Other Types of Observed Neural Dynamics
confidence: 99%
“…In the low-dimensional models, the expressive power of the specific parameterization of f must be high enough to capture metastable dynamics. Radial basis function networks, Gaussian processes with square-exponential kernels, linear-nonlinear forms with hyperbolic tangent function, switching linear dynamical systems, and gated recurrent units were investigated as flexible methods of parameterizing f and shown to have sufficient expressive power in the low-dimensional regime (Zhao and Park, 2016; Duncker et al, 2019; Jordan et al, 2021; Nassar et al, 2019; Zhao et al, 2019). Due to the high flexibility of the functional form, it is important to put sufficient emphasis on simpler, more robustly generalizing functions.…”
Section: Latent Nonlinear Continuous Dynamical Systems Modeling
confidence: 99%
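To make the snippet above concrete, one of the listed parameterizations, a radial basis function network, expresses the latent vector field f as a weighted sum of Gaussian bumps centered in state space. A minimal sketch, with the latent dimension, centers, width, and weights all chosen arbitrarily for illustration (not taken from any of the cited papers):

```python
import numpy as np

rng = np.random.default_rng(1)

# Radial basis function (RBF) network parameterizing a latent vector
# field f: R^d -> R^d.
d, k = 2, 16                                  # latent dim, number of basis centers
centers = rng.uniform(-2, 2, (k, d))          # basis centers (illustrative)
width = 0.5                                   # shared Gaussian kernel width
weights = rng.standard_normal((k, d)) * 0.1   # readout weights

def f(x):
    """Evaluate the RBF-parameterized dynamics at state x of shape (d,)."""
    sq = np.sum((centers - x) ** 2, axis=1)   # squared distance to each center
    phi = np.exp(-sq / (2 * width ** 2))      # Gaussian basis activations, shape (k,)
    return phi @ weights                      # weighted sum, shape (d,)

# One Euler step of the latent dynamics dx/dt = f(x):
x = np.array([0.5, -0.5])
x_next = x + 0.1 * f(x)
print(x_next.shape)  # (2,)
```

Because each basis function is local, fitting such a model can place slow regions or multiple attracting sets in different parts of state space, which is what makes this family expressive enough for the metastable dynamics discussed in the quote.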
“…In some recent papers (de Brouwer et al, 2019; Jordan et al, 2019) continuous-time ODE approximations of discrete RNNs were sought based on inverting the forward Euler rule for numerically solving continuous ODE systems. A related idea is that of Neural ODE (Chen et al, 2018), where the flow is given by a (deep) neural network (cf.
Section: Related Work
confidence: 99%