2021
DOI: 10.1017/dce.2021.15

Model order reduction based on Runge–Kutta neural networks

Abstract: Model order reduction (MOR) methods enable the generation of real-time-capable digital twins, with the potential to unlock various novel value streams in industry. While traditional projection-based methods are robust and accurate for linear problems, incorporating machine learning to handle nonlinearity has become a new option for reducing complex problems. These methods are independent of the numerical solver for the full-order model and keep the whole workflow nonintrusive. Such methods …

Cited by 12 publications (9 citation statements) · References 32 publications

“…We plan to develop and employ suitable methods of model order reduction. A promising approach is MOR via artificial neural networks [18].…”
Section: Discussion (citation type: mentioning, confidence: 99%)
“…In this case the inputs to the neural network are the current state of the system y_{r,i} and µ_i, and the new reduced state y_{r,i+1} is calculated as y_{r,i+1} = y_{r,i} + δt g_ee(y_{r,i}, µ_i) (Equation 11). This looks similar to an explicit Euler integrator, so we denote the network's function as g_ee(y_{r,i}, µ_i) and call such a network an Explicit Euler Neural Network (EENN) [9,23]. From Equation 11, we know that an EENN is essentially a variant of a residual network [24] which learns the increment between two system states instead of learning to map the new state directly from the old state.…”
Section: ROM Identification Using ANNs (citation type: mentioning, confidence: 99%)
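
The EENN update quoted above lends itself to a short sketch. Below is a minimal, hypothetical Python/PyTorch illustration of the explicit-Euler step in Equation 11, together with a Runge–Kutta (RK4) variant in the spirit of the paper's title. The network architecture, layer sizes, and names (IncrementNet, eenn_step, rknn4_step) are illustrative assumptions, not the authors' actual implementation.

import torch
import torch.nn as nn

class IncrementNet(nn.Module):
    # Feed-forward network approximating the reduced right-hand side g(y_r, mu).
    # Hidden sizes are arbitrary choices for illustration.
    def __init__(self, state_dim: int, param_dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim + param_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, state_dim),
        )

    def forward(self, y_r, mu):
        # Concatenate reduced state and parameters into one input vector.
        return self.net(torch.cat([y_r, mu], dim=-1))

def eenn_step(g, y_r, mu, dt):
    # Explicit-Euler step (Equation 11): the network learns the increment
    # between two reduced states, like a residual-network update.
    return y_r + dt * g(y_r, mu)

def rknn4_step(g, y_r, mu, dt):
    # Hypothetical RK4-style variant: the same network evaluated at
    # intermediate stages, mirroring a classical Runge-Kutta integrator.
    k1 = g(y_r, mu)
    k2 = g(y_r + 0.5 * dt * k1, mu)
    k3 = g(y_r + 0.5 * dt * k2, mu)
    k4 = g(y_r + dt * k3, mu)
    return y_r + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Example usage with made-up dimensions:
# g = IncrementNet(state_dim=8, param_dim=2)
# y0, mu = torch.zeros(1, 8), torch.ones(1, 2)
# y1 = eenn_step(g, y0, mu, dt=0.01)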
“…surrogate ROMs [9]. Other network architectures such as the long short-term memory (LSTM) network [10] and the recurrent neural network (RNN) [11,12] have also been tested for the purpose of reduced modelling.…”
Section: Introduction (citation type: mentioning, confidence: 99%)
“…Currently, most data-driven MOR methods are based on machine learning (ML). For example, feed-forward neural networks (Regazzoni et al., 2019), operator inference (OpInf) (Peherstorfer and Willcox, 2016), long short-term memory (LSTM) neural networks (Mohan and Gaitonde, 2018), recurrent neural networks (RNN) (Kani and Elsheikh, 2017; Wang et al., 2020; Wu and Noels, 2022), deep learning (Fresca and Manzoni, 2022), sparse identification of nonlinear dynamics (SINDy) (Champion et al., 2019), and Runge–Kutta neural networks (Zhuang et al., 2021) have been tested for reduced modeling.…”
Section: Introduction (citation type: mentioning, confidence: 99%)