2021
DOI: 10.1002/acs.3216
dynoNet: A neural network architecture for learning dynamical systems

Abstract: This article introduces a network architecture, called dynoNet, utilizing linear dynamical operators as elementary building blocks. Owing to the dynamical nature of these blocks, dynoNet networks are tailored for sequence modeling and system identification purposes. The back-propagation behavior of the linear dynamical operator with respect to both its parameters and its input sequence is defined. This enables end-to-end training of structured networks containing linear dynamical operators and other differentiable units.
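A minimal PyTorch sketch may help fix the idea. This is not the dynoNet library's actual API: the class names, the naive IIR loop, and the toy training step are illustrative assumptions, and the paper derives an efficient closed-form backward pass rather than relying on the unrolled recursion shown here.

```python
import torch
import torch.nn as nn

class LinearDynamicalOperator(nn.Module):
    """Illustrative SISO transfer function G(q) = B(q)/A(q):
    y[k] = sum_i b_i u[k-i] - sum_j a_j y[k-1-j].
    The unrolled recursion is differentiable, so b and a can be trained
    end-to-end by autograd (dynoNet uses a faster custom backward pass)."""
    def __init__(self, n_b=2, n_a=2):
        super().__init__()
        self.b = nn.Parameter(0.01 * torch.randn(n_b))  # numerator coefficients
        self.a = nn.Parameter(0.01 * torch.randn(n_a))  # denominator coefficients (leading 1 implicit)

    def forward(self, u):  # u: (T,) input sequence
        y = []
        for k in range(u.shape[0]):
            yk = sum(self.b[i] * u[k - i] for i in range(len(self.b)) if k - i >= 0)
            yk = yk - sum(self.a[j] * y[k - 1 - j] for j in range(len(self.a)) if k - 1 - j >= 0)
            y.append(yk)
        return torch.stack(y)

class WienerBlock(nn.Module):
    """Wiener-style block: linear dynamics followed by a static nonlinearity."""
    def __init__(self):
        super().__init__()
        self.G = LinearDynamicalOperator(n_b=2, n_a=2)
        self.f = nn.Sequential(nn.Linear(1, 16), nn.Tanh(), nn.Linear(16, 1))

    def forward(self, u):
        return self.f(self.G(u).unsqueeze(-1)).squeeze(-1)

# Toy end-to-end training step on placeholder data.
u, y_meas = torch.randn(200), torch.randn(200)
model = WienerBlock()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss = torch.mean((model(u) - y_meas) ** 2)
loss.backward()  # gradients flow through the IIR recursion and the MLP
opt.step()
```

Composing such blocks yields Wiener, Hammerstein, and more general block-oriented structures; the key point is that the LTI operators behave as ordinary differentiable layers inside a deep-learning framework.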


Cited by 30 publications (16 citation statements) · References 14 publications
“…The second part is based on LTI dynamical operators, as implemented in the novel dynoNet 1. LTI operators were shown to be efficient [2] at learning complex causal non-linear dynamics while remaining suitable for end-to-end backpropagation. These properties are advantageous for learning the dynamics of the flexible-joint manipulator.…”
Section: B. Two-stage Model (mentioning)
confidence: 99%
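For reference, each LTI operator implements a rational transfer function in the time-shift operator q⁻¹. The form below uses conventional system-identification notation rather than symbols quoted from the cited paper:

```latex
G(q) = \frac{B(q)}{A(q)}
     = \frac{b_0 + b_1 q^{-1} + \dots + b_{n_b} q^{-n_b}}
            {1 + a_1 q^{-1} + \dots + a_{n_a} q^{-n_a}},
\qquad y(t) = G(q)\,u(t),
\quad\text{i.e.}\quad
y(t) = \sum_{i=0}^{n_b} b_i\, u(t-i) - \sum_{j=1}^{n_a} a_j\, y(t-j).
```

Both coefficient vectors (b, a) are learned by back-propagation, which is what makes the operator usable as a network layer.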
“…The architecture of a branch is as follows: first, an LTI block with a = 2 and b = 2, which outputs 14 features (intended to represent a rough position-plus-velocity approximation). Here a and b define the polynomial orders of the denominator and numerator of a rational transfer function (see [2] for details). The LTI block is followed by two fully connected layers.…”
Section: B. Two-stage Model (mentioning)
confidence: 99%
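Following the quoted description, one branch might be assembled as below. This is a sketch reusing the illustrative LinearDynamicalOperator class from the earlier snippet; only n_a = n_b = 2 and the 14 features come from the quote, while the hidden width and output size are assumptions:

```python
import torch
import torch.nn as nn

class Branch(nn.Module):
    """One branch: an LTI block with denominator/numerator orders
    a = b = 2 producing 14 features, then two fully connected layers."""
    def __init__(self, n_features=14, hidden=64, n_out=1):
        super().__init__()
        # 14 parallel second-order filters over the same input channel,
        # reusing LinearDynamicalOperator from the sketch above.
        self.lti = nn.ModuleList(
            [LinearDynamicalOperator(n_b=2, n_a=2) for _ in range(n_features)]
        )
        self.fc1 = nn.Linear(n_features, hidden)
        self.fc2 = nn.Linear(hidden, n_out)

    def forward(self, u):  # u: (T,) input sequence
        feats = torch.stack([g(u) for g in self.lti], dim=-1)  # (T, 14)
        return self.fc2(torch.tanh(self.fc1(feats)))           # (T, n_out)
```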
“…Other contributions propose new DL-based model structures explicitly conceived for SI purposes. For instance, [7] introduces a novel neural architecture which includes linear transfer functions as elementary building blocks, while [8] proposes model structures based on Koopman operator theory [9], lifting non-linear state-space dynamics to a higher-dimensional space where the dynamics are linear. The mapping from the original to the higher-dimensional space is learned using DL tools.…”
Section: Introduction (mentioning)
confidence: 99%
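As a rough sketch of the Koopman-based route mentioned above (everything here, including names, dimensions, and the loss, is an illustrative assumption rather than the method of [8]): a lifting network ψ and a linear matrix K are trained jointly so that the lifted state evolves linearly, ψ(x_{k+1}) ≈ K ψ(x_k).

```python
import torch
import torch.nn as nn

# Hypothetical Koopman-style lifting: psi maps the state into a
# higher-dimensional space where a single matrix K advances it one step.
n_x, n_lift = 2, 16  # state / lifted dimensions (assumed)
psi = nn.Sequential(nn.Linear(n_x, 32), nn.Tanh(), nn.Linear(32, n_lift))
K = nn.Parameter(torch.eye(n_lift))  # linear dynamics in the lifted space

opt = torch.optim.Adam(list(psi.parameters()) + [K], lr=1e-3)
x_k, x_next = torch.randn(128, n_x), torch.randn(128, n_x)  # placeholder state pairs

pred = psi(x_k) @ K.T                         # K psi(x_k), batched
loss = torch.mean((pred - psi(x_next)) ** 2)  # linear-evolution residual
opt.zero_grad(); loss.backward(); opt.step()
```

In practice such formulations add a reconstruction (decoder) term so the trivial minimizer ψ ≡ const is ruled out; the sketch only shows the linear-evolution constraint.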