2020
DOI: 10.1016/j.cma.2020.112947
A non-intrusive multifidelity method for the reduced order modeling of nonlinear problems

Cited by 53 publications (32 citation statements)
References 25 publications
“…In [13,14], latent dynamics and nonlinear mappings are modelled as NODEs and autoencoders, respectively; in [4,71–73], autoencoders are used to learn approximate invariant subspaces of the Koopman operator. Relatedly, there have been studies on learning direct mappings via, for example, a neural network, from parameters (including time parameters) to either latent states or approximate solution states [74–78], where the latent states are computed by using autoencoders or linear POD.…”
Section: Related Work (a) Classical Data-driven Surrogate Modelling
confidence: 99%
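The statement above describes computing latent states by linear POD so that a regression map from parameters to latent coefficients can be learned. A minimal sketch of that preprocessing step, using synthetic snapshot data (the matrix sizes and truncation rank here are illustrative assumptions, not values from the cited works):

```python
import numpy as np

# Synthetic snapshot matrix S (n_dof x n_snapshots); in practice its
# columns would be high-fidelity solution vectors at sampled parameters.
rng = np.random.default_rng(0)
S = rng.standard_normal((200, 30))

# Truncated POD basis via SVD; r is the reduced dimension (assumed here).
U, _, _ = np.linalg.svd(S, full_matrices=False)
r = 5
V = U[:, :r]                 # POD basis, shape (n_dof, r)

# Latent states: projection of each snapshot onto the POD basis.
# These columns would serve as regression targets for a learned
# parameter -> latent-state map.
latent = V.T @ S             # shape (r, n_snapshots)

# Best rank-r approximation of the snapshots in that basis.
reconstruction = V @ latent
```

A surrogate (e.g. a small neural network, as the quote suggests) would then be trained to map each sampled parameter to the corresponding column of `latent`.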
“…Since the low-fidelity evaluations can be computationally cheap, the offline efficiency can be effectively improved by reducing the number of high-fidelity evaluations. Kast et al. [66] employ the linear model of coregionalization [67] that expresses the prior of a hierarchy of M solution fidelities as…”
Section: Multifidelity Regression
confidence: 99%
“…The high-fidelity RB coefficients are defined by projecting the high-fidelity solution vectors onto the high-fidelity RB. Two alternative ways of evaluating low-fidelity RB coefficients have been given by Kast et al. [66]: one choice is to project the low-fidelity solution vectors onto a set of POD basis vectors extracted from the low-fidelity snapshots, while the other is to project a reconstructed solution [69] from the bifidelity data set onto the high-fidelity RB. In the latter, the reconstruction q̂ at each parameter location h is defined as a linear combination of the high-fidelity snapshots S_H, i.e.…”
Section: Multifidelity Regression
confidence: 99%
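The bifidelity reconstruction described above can be sketched as follows: combination weights are found on the cheap low-fidelity level and then applied to the high-fidelity snapshots. All matrices below are synthetic placeholders, and the least-squares computation of the weights is an illustrative assumption about how such coefficients might be obtained, not the exact procedure of the cited reference:

```python
import numpy as np

rng = np.random.default_rng(1)
n_hf, n_lf, m = 300, 40, 8             # HF dofs, LF dofs, number of snapshots

S_H = rng.standard_normal((n_hf, m))   # high-fidelity snapshots
S_L = rng.standard_normal((n_lf, m))   # low-fidelity snapshots (same parameters)

# Low-fidelity solution at a new parameter location (synthetic).
u_L = S_L @ rng.standard_normal(m)

# Combination weights c from the low-fidelity level: solve S_L c ≈ u_L.
c, *_ = np.linalg.lstsq(S_L, u_L, rcond=None)

# Reconstruction q_hat = S_H c as a linear combination of HF snapshots,
# then its RB coefficients via projection onto a high-fidelity POD basis.
q_hat = S_H @ c
V_H, _, _ = np.linalg.svd(S_H, full_matrices=False)
rb_coeffs = V_H[:, :4].T @ q_hat       # truncation to 4 modes is illustrative
```

The key point is that the expensive high-fidelity model is only queried for the snapshot set `S_H`; everything parameter-dependent is resolved on the low-fidelity level.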
“…However, as mentioned above, POD (and DEIM) needs a number of FOM simulations to construct the ROM. For this reason, [42] implemented a multi-fidelity strategy in which the parametric dependence was reconstructed using a large number of low-fidelity models and a minimal number of high-fidelity evaluations. Other approaches exploit machine learning to construct an input-output relationship, with convolutional neural networks [43] and autoencoders [44], which require the training of a network, again using preexisting data.…”
Section: Introduction
confidence: 99%