2019 IEEE 58th Conference on Decision and Control (CDC)
DOI: 10.1109/cdc40024.2019.9030219
Deep Convolutional Networks in System Identification

Abstract: Recent developments within deep learning are relevant for nonlinear system identification problems. In this paper, we establish connections between the deep learning and the system identification communities. It has recently been shown that convolutional architectures are at least as capable as recurrent architectures when it comes to sequence modeling tasks. Inspired by these results, we explore the explicit relationships between the recently proposed temporal convolutional network (TCN) and two classic system…

Cited by 52 publications (36 citation statements)
References 51 publications
“…From a practical perspective, the effectiveness of deep CNN architectures for different system identification and time series modeling tasks has been demonstrated in several contributions [3-5]. Even though it is mathematically unclear whether increasing the number of hidden layers extends the class of dynamics that CNNs can represent, it has been observed experimentally that, for a given number of training parameters and a given computational effort, deeper networks are able to learn more complex dependencies than shallower ones.…”
Section: Representational Power of dynoNet Architectures
confidence: 99%
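A concrete way to illustrate the depth effect described in the statement above is the receptive field of a stack of dilated causal convolutions, the standard TCN construction (a textbook property, added here for illustration rather than taken from the cited work): with kernel size k and the dilation doubling at each of L layers, the window of past samples the output can depend on grows exponentially with depth,

```latex
R = 1 + (k-1)\sum_{i=0}^{L-1} 2^{\,i} \;=\; 1 + (k-1)\,\bigl(2^{L}-1\bigr)
```

so, for example, k = 3 and L = 8 already cover R = 511 past samples, while the parameter count grows only linearly in L.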
“…Among the layers routinely applied in DL, 1D convolution [3] is the closest match. In particular, the 1D causal convolution layer [4,5] corresponds to filtering an input sequence through a causal finite impulse response (FIR) dynamical system. The dynoNet architecture may be seen as a generalization of the causal 1D convolutional neural network (CNN) enabling IIR filtering, owing to the description of the dynamical layers as rational transfer functions.…”
Section: Introduction
confidence: 99%
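To make the FIR/IIR distinction in the statement above concrete, here is a minimal sketch in standard transfer-function notation (generic system-identification notation, not taken from the cited work): a causal 1D convolution with kernel coefficients b_0, …, b_{K-1} is exactly an FIR filter, while a dynoNet dynamical layer realizes a rational transfer function whose feedback (denominator) coefficients produce an infinite impulse response.

```latex
% Causal 1D convolution = FIR filtering of the input u:
y(t) = \sum_{k=0}^{K-1} b_k \, u(t-k)
\quad\Longleftrightarrow\quad
G(q) = b_0 + b_1 q^{-1} + \dots + b_{K-1} q^{-(K-1)}

% dynoNet dynamical layer = rational transfer function (IIR when some a_i \neq 0):
G(q) = \frac{b_0 + b_1 q^{-1} + \dots + b_{n_b} q^{-n_b}}
            {1 + a_1 q^{-1} + \dots + a_{n_a} q^{-n_a}}
```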
“…Inspired by this observation, an alternative approach to the given attitude estimation task is to train a neural network end-to-end on the raw IMU data of a large variety of experimental datasets with ground-truth measurements. Considering the success of neural networks in other system identification tasks [18,19], it seems promising to employ them for robust attitude estimation.…”
Section: The Potential of Neural Networks in Inertial Attitude Estimation
confidence: 99%
“…TCNs are stateless feed-forward neural networks [18], which are able to model dynamic systems by processing fixed-size windows at once instead of samples sequentially. Transformers are the current state-of-the-art architectures for natural language processing because of their ability to process relations between two distant points in time [30].…”
Section: Choice of the Neural Network Structure
confidence: 99%
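As an illustration of the "fixed window, no recurrent state" point in the statement above, a minimal causal convolution block in PyTorch might look as follows (a generic sketch; the class name, layer sizes, and dilation are illustrative assumptions, not the architecture of the cited paper):

```python
import torch
import torch.nn.functional as F
from torch import nn

class CausalConv1d(nn.Module):
    """Causal 1D convolution: the output at time t depends only on inputs up to t."""

    def __init__(self, in_ch, out_ch, kernel_size, dilation=1):
        super().__init__()
        # Left-pad by (kernel_size - 1) * dilation so no future samples leak in.
        self.pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size, dilation=dilation)

    def forward(self, x):                     # x: (batch, channels, time)
        return self.conv(F.pad(x, (self.pad, 0)))  # pad only the past side

# A TCN consumes an entire fixed-size window in one forward pass (no hidden state):
window = torch.randn(8, 1, 128)               # 8 input sequences, 128 samples each
block = CausalConv1d(in_ch=1, out_ch=16, kernel_size=3, dilation=2)
features = block(window)                       # shape (8, 16, 128), same time length
```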
“…Recently, the connection between NNs and nonlinear black-box system identification has been highlighted, e.g., in [Andersson et al., 2019, Ljung et al., 2020, Schoukens and Ljung, 2019]. In Ljung et al. [2020] and Andersson et al. [2019], feedforward NNs are presented as a special case of nonlinear autoregressive models with exogenous input (NARX), in which multiple NARX models are stacked on top of each other.…”
confidence: 99%
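To illustrate that reading, a one-step-ahead NARX predictor with a feedforward network as the nonlinear map can be sketched as follows (a minimal illustration; the lag orders, layer width, and class name are assumptions, not the models of the cited works):

```python
import torch
from torch import nn

class NarxNet(nn.Module):
    """NN-NARX predictor: y_hat(t) = f(y(t-1),...,y(t-na), u(t-1),...,u(t-nb)),
    where f is a small feedforward network."""

    def __init__(self, na=2, nb=2, hidden=32):
        super().__init__()
        self.f = nn.Sequential(
            nn.Linear(na + nb, hidden),   # regressor of na past outputs, nb past inputs
            nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, y_past, u_past):         # shapes: (batch, na), (batch, nb)
        regressor = torch.cat([y_past, u_past], dim=1)
        return self.f(regressor).squeeze(-1)   # one-step-ahead prediction

# Stacking such blocks, with one block's output feeding the next regressor,
# gives the layered view of NARX models described in the statement above.
```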