2021
DOI: 10.1002/nme.6831
Nonlinear dimensionality reduction for parametric problems: A kernel proper orthogonal decomposition

Abstract: Reduced-order models are essential tools to deal with parametric problems in the context of optimization, uncertainty quantification, or control and inverse problems. The set of parametric solutions lies in a low-dimensional manifold (with dimension equal to the number of independent parameters) embedded in a large-dimensional space (dimension equal to the number of degrees of freedom of the full-order discrete model). A posteriori model reduction is based on constructing a basis from a family of snapshots (so…
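The a posteriori reduction described in the abstract — building a basis from a family of snapshots — can be illustrated with a minimal POD sketch. All names and the synthetic data below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Minimal POD sketch: columns of S are parametric snapshots of a
# full-order model. Here the snapshots are synthetic and, by
# construction, lie near a 3-dimensional subspace of R^200.
rng = np.random.default_rng(0)
n_dof, n_snap = 200, 15
modes = rng.standard_normal((n_dof, 3))
coeffs = rng.standard_normal((3, n_snap))
S = modes @ coeffs + 1e-6 * rng.standard_normal((n_dof, n_snap))

# POD basis = left singular vectors of the snapshot matrix.
U, sigma, _ = np.linalg.svd(S, full_matrices=False)

# Truncate where the singular values capture (almost) all the energy.
energy = np.cumsum(sigma**2) / np.sum(sigma**2)
k = int(np.searchsorted(energy, 1.0 - 1e-8) + 1)
V = U[:, :k]  # reduced basis, n_dof x k

# Every snapshot is well approximated by its projection onto span(V).
err = np.linalg.norm(S - V @ (V.T @ S)) / np.linalg.norm(S)
print(k, err)
```

With the synthetic data above, the energy criterion recovers the true manifold dimension (k = 3) and the projection error is at the noise level. This linear construction is the baseline that the paper's kernel POD extends to nonlinear (curved) manifolds.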

Cited by 19 publications (14 citation statements). References 20 publications.
“…The analysis for all possible combinations of parameters, along with the required nonlinear system of equations with a high number of degrees of freedom per each case, motivates the use of a ROM. The proposal presented here is to use a posteriori ROM, following the ideas in Díez et al (2021).…”
Section: Methods
“…The number of parameters in the model (six in total) motivates the use of a ROM for multiple evaluation. We study the performance of standard POD and local POD, accounting for both the original snapshots and new, quadratically generated, ones (Díez et al 2021).…”
Section: Introduction
“…where ε(t) denotes the error associated with the projection of s(t) − s_ref onto the linear subspace V. While many different manifold constructs are conceivable, we focus on a polynomial mapping between the high-dimensional data samples and their lower-dimensional representations. Explicit nonlinear mappings with a polynomial structure were originally proposed in manifold learning [42], but similar formulations have emerged recently in model reduction [2,3,5,43,44]. Our attention will be restricted to the case of quadratic Kronecker products:…”
Section: Data-driven Quadratic Solution-manifolds
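The quadratic Kronecker-product ansatz referred to in the quote above amounts to an approximation of the form s(t) − s_ref ≈ V q(t) + W (q(t) ⊗ q(t)). A hedged sketch, with V, W, and q as generic placeholder names and the operators fitted by least squares on synthetic data that lies exactly on such a manifold:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, m = 50, 2, 8  # full dimension, reduced dimension, sample count

# Synthetic snapshots generated exactly by a quadratic manifold map:
# s - s_ref = V_true q + W_true (q kron q).
V_true = rng.standard_normal((n, k))
W_true = rng.standard_normal((n, k * k))
Q = rng.standard_normal((k, m))  # reduced coordinates q(t), one column per sample
Kron = np.vstack([np.kron(Q[:, j], Q[:, j]) for j in range(m)]).T  # k^2 x m
S = V_true @ Q + W_true @ Kron  # columns are s(t) - s_ref

# Non-intrusive fit: recover [V, W] by linear regression of S on [Q; q kron q].
Z = np.vstack([Q, Kron])        # (k + k^2) x m regressor matrix
VW = S @ np.linalg.pinv(Z)
V_fit, W_fit = VW[:, :k], VW[:, k:]

err = np.linalg.norm(V_fit @ Q + W_fit @ Kron - S) / np.linalg.norm(S)
print(err)
```

Because the data is generated by the same quadratic model, the regression reconstructs the snapshots to machine precision (the recovered V, W need not match V_true, W_true, since the symmetric entries of q ⊗ q make the fit non-unique).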
“…The mapping can be constructed in non-intrusive fashion using linear regression and is driven by physics-based training data. A similar concept was also explored in [5], where a kernel principal component analysis is used to find a nonlinear manifold with lower dimensionality. In that work, the approximation space is enriched with element-wise cross-products of the snapshots, thereby establishing globally curved manifolds.…”
Section: Introduction