2020
DOI: 10.1016/j.ifacol.2020.12.1329
Deep Learning and System Identification

Cited by 122 publications (61 citation statements). References 9 publications.
“…Therefore, we chose n_d = 5 as a fixed value for two reasons: first, it guaranteed a robust fit of the test data even when the size of the training set was reduced, thus avoiding overfitting issues (Table 3). Second, n_d = 5 yielded more interpretable models than system identification approaches such as neural networks or deep learning, for which goodness of fit is favored over model interpretability (Ljung et al., 2020).…”
Section: Discussion
confidence: 99%
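The excerpt above contrasts a fixed low-order, interpretable model with black-box neural networks. As a minimal sketch of what such an interpretable alternative can look like, the snippet below fits a linear-in-parameters ARX-style model by ordinary least squares; the precise meaning of n_d = 5 is defined in the citing paper, so the lag order, function names, and synthetic data here are purely illustrative.

```python
# Illustrative sketch only: a fixed-order, linear-in-parameters model fitted by
# ordinary least squares. The citing paper defines its own n_d = 5; here it is
# reused merely as a number of lagged regressors, so this is not the authors'
# actual model, just the kind of small, inspectable parameter vector the
# excerpt contrasts with deep networks.
import numpy as np

def fit_arx(u, y, n_d=5):
    """Least-squares fit of y[k] from the last n_d outputs and inputs."""
    rows, targets = [], []
    for k in range(n_d, len(y)):
        rows.append(np.concatenate([y[k - n_d:k], u[k - n_d:k]]))
        targets.append(y[k])
    Phi = np.asarray(rows)                                   # regressor matrix
    theta, *_ = np.linalg.lstsq(Phi, np.asarray(targets), rcond=None)
    return theta                                             # 2*n_d coefficients

# Synthetic example: the fitted coefficients can be read off directly,
# which is the interpretability argument made in the excerpt.
rng = np.random.default_rng(0)
u = rng.standard_normal(500)
y = np.zeros(500)
for k in range(2, 500):
    y[k] = 0.7 * y[k - 1] - 0.2 * y[k - 2] + 0.5 * u[k - 1]
print(fit_arx(u, y))
```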
“…18). The principal idea is to seek a system identification operator [29] that corrects, in bulk quantities, the amplitude residuals that the CRAN architecture develops. We achieve this by learning the initial 300 performance steps and forecasting all 500 steps.…”
Section: Near-Wake Dynamics Using the CNN-RNN Driver
confidence: 99%
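The CRAN architecture itself is described in the citing paper; the sketch below only illustrates, under strong simplifying assumptions, the bulk residual-correction idea the excerpt mentions: a correction operator is fitted on the first 300 steps, where both the surrogate forecast and the reference are known, and then applied over all 500 steps. The signals, the scalar gain-plus-bias operator, and the variable names (reference, surrogate, n_train) are hypothetical stand-ins, not the identification operator of [29].

```python
# Hypothetical illustration of a bulk amplitude-residual correction: fit a
# scalar gain and bias so that reference ~ a * surrogate + b on the first 300
# steps, then apply the correction over all 500 steps. This is only a stand-in
# for the operator discussed in the excerpt; the CRAN-based formulation is
# given in the citing paper.
import numpy as np

n_total, n_train = 500, 300
t = np.arange(n_total)
reference = np.sin(0.05 * t)                  # stand-in for the true signal
surrogate = 0.8 * np.sin(0.05 * t) + 0.02     # forecast with damped amplitude

# Least-squares fit of the correction on the training window.
A = np.column_stack([surrogate[:n_train], np.ones(n_train)])
(a, b), *_ = np.linalg.lstsq(A, reference[:n_train], rcond=None)

corrected = a * surrogate + b                 # applied over all 500 steps
print("mean residual before:", np.abs(reference - surrogate).mean())
print("mean residual after: ", np.abs(reference - corrected).mean())
```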
“…Although neural networks based on convolutional or recurrent layers have inherent capabilities to model dynamic systems, only a limited amount of work has omitted the autoregression. It has been shown that RNNs behave like a nonlinear state-space model (NLSS) (Ljung et al., 2020), and an RNN has been applied non-autoregressively to a synthetic dataset (Gonzalez and Yu, 2018). The authors of the present paper applied non-autoregressive temporal convolutional networks (TCNs) and gated recurrent units (GRUs) to an inertial-measurement-based sensor fusion task, outperforming state-of-the-art domain-specific sensor fusion methods (Weber et al., 2020).…”
Section: Introduction
confidence: 98%
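A minimal sketch, assuming PyTorch, of the non-autoregressive use of an RNN mentioned in the excerpt: the GRU maps the measured input sequence directly to the output sequence, its hidden state playing the role of the state in a nonlinear state-space (NLSS) model, and no predicted outputs are fed back. The class name, layer sizes, and toy data are illustrative, not taken from any of the cited works.

```python
# Minimal sketch (assuming PyTorch) of a non-autoregressive RNN for system
# identification: the network maps the exogenous input sequence u directly to
# the output sequence y; the GRU's hidden trajectory acts as the internal
# state, and no past predictions are fed back into the model.
import torch
import torch.nn as nn

class NonAutoregressiveRNN(nn.Module):
    def __init__(self, n_inputs=1, n_outputs=1, hidden=32):
        super().__init__()
        self.rnn = nn.GRU(n_inputs, hidden, batch_first=True)
        self.readout = nn.Linear(hidden, n_outputs)

    def forward(self, u):                  # u: (batch, time, n_inputs)
        h, _ = self.rnn(u)                 # hidden trajectory = model state
        return self.readout(h)             # y_hat: (batch, time, n_outputs)

# Toy one-shot training loop on synthetic data (shapes and hyperparameters illustrative).
model = NonAutoregressiveRNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
u = torch.randn(8, 200, 1)                 # batch of input sequences
y = 0.5 * torch.tanh(u.cumsum(dim=1))      # toy "true" system response
for _ in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(u), y)
    loss.backward()
    opt.step()
```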
“…Most current neural network-based system identification methods are still NARX variants with different neural network architectures as the nonlinearity. In related work, a multitude of system identification methods based on autoregressive neural networks has been proposed, using multilayer perceptrons (MLPs) (Shi et al., 2019), cascaded MLPs (Ljung et al., 2020), convolutional neural networks (Lopez and Yu, 2017), TCNs (Andersson et al., 2019), and recurrent neural networks (Kumar et al., 2019), with promising results.…”
Section: Introduction
confidence: 99%
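To make the NARX pattern named in the excerpt concrete, here is a hedged sketch assuming scikit-learn: an MLP serves as the nonlinearity over lagged outputs and inputs, training is one-step-ahead, and free-run simulation feeds predictions back, which is exactly the autoregression the surveyed methods share. Lag orders, the toy system, and helper names (make_regressors) are illustrative and not drawn from the cited papers.

```python
# Hypothetical sketch (assuming scikit-learn) of the NARX pattern described in
# the excerpt: an MLP is the nonlinearity over lagged outputs and inputs.
# Training is one-step-ahead; simulation feeds the model's own predictions
# back, which is the autoregressive behaviour shared by the surveyed methods.
import numpy as np
from sklearn.neural_network import MLPRegressor

def make_regressors(u, y, na=3, nb=3):
    X, targets = [], []
    for k in range(max(na, nb), len(y)):
        X.append(np.concatenate([y[k - na:k], u[k - nb:k]]))
        targets.append(y[k])
    return np.asarray(X), np.asarray(targets)

rng = np.random.default_rng(0)
u = rng.standard_normal(800)
y = np.zeros(800)
for k in range(1, 800):                    # toy nonlinear system
    y[k] = 0.8 * np.tanh(y[k - 1]) + 0.4 * u[k - 1]

X, t = make_regressors(u, y)
narx = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0).fit(X, t)

# Free-run simulation: past *predictions* replace past measurements.
na = nb = 3
y_sim = list(y[:na])
for k in range(na, len(y)):
    phi = np.concatenate([y_sim[k - na:k], u[k - nb:k]])
    y_sim.append(float(narx.predict(phi.reshape(1, -1))[0]))
```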