2020
DOI: 10.1016/j.jcp.2020.109787

Gradient-based constrained optimization using a database of linear reduced-order models

Cited by 71 publications (52 citation statements)
References 32 publications
“…Optimization done using greedy methods may take more time than full HDM optimization. In Choi et al.,65 for one of the cases shown, a full HDM optimization took almost 1/50th of the time of the off-line basis-building procedure. Adaptive sampling is on-line only: each call on the ROM might entail taking a new snapshot.…”
Section: Sampling (mentioning, confidence: 99%)
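
The cost structure behind this observation is visible in a generic greedy sampling loop: each iteration pays for one full HDM solve at the candidate point where a cheap error indicator is largest, so the off-line phase can accumulate many HDM solves before any optimization begins. The sketch below is illustrative only; `solve_hdm`, `build_rom`, and `error_indicator` are hypothetical placeholders, not functions from the cited papers.

```python
import numpy as np

def greedy_sample(candidates, solve_hdm, build_rom, error_indicator, tol, max_iters):
    """Generic off-line greedy sampling loop (illustrative sketch).

    Each iteration adds the candidate parameter where the current ROM
    is estimated to be worst, then pays for one full HDM solve there.
    The off-line cost is therefore one HDM solve per iteration, which
    is why it can dwarf the cost of a single HDM-based optimization.
    """
    snapshots = [solve_hdm(candidates[0])]        # seed with one HDM solve
    rom = build_rom(snapshots)
    for _ in range(max_iters):
        # Cheap error indicator (e.g. a residual norm) over all candidates.
        errs = [error_indicator(rom, mu) for mu in candidates]
        worst = int(np.argmax(errs))
        if errs[worst] < tol:
            break
        # Expensive step: a new HDM snapshot at the worst-approximated point.
        snapshots.append(solve_hdm(candidates[worst]))
        rom = build_rom(snapshots)
    return rom
```
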
“…This estimator can then be used to address a known problem, mentioned in Paul-Dubois-Taine and Amsallem64 and Choi et al.,65 with greedy methods that use the residual as an error estimator: the residual might not reflect the proper scale of the error. The leave-one-out error estimates the error for a region of parameter values, while this proposed method estimates the error for a given set of parameter values.…”
Section: Error Estimation and Control (mentioning, confidence: 99%)
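
As a concrete reading of the leave-one-out idea: for each sampled parameter point, rebuild the ROM from the remaining snapshots and compare its prediction against the stored HDM solution at the held-out point, which yields errors carrying the true solution scale that a raw residual norm may lack. The sketch below is generic; `build_rom` and `evaluate_rom` are hypothetical callbacks, not functions from the cited papers.

```python
import numpy as np

def leave_one_out_errors(mus, hdm_solutions, build_rom, evaluate_rom):
    """Leave-one-out error estimates over a snapshot database (sketch).

    For each sampled parameter mu_j, a ROM is rebuilt from the remaining
    snapshots and compared against the stored HDM solution at mu_j.
    """
    errors = []
    for j, mu in enumerate(mus):
        held_out = hdm_solutions[j]
        rest = [s for k, s in enumerate(hdm_solutions) if k != j]
        rom = build_rom(rest)
        approx = evaluate_rom(rom, mu)
        errors.append(np.linalg.norm(approx - held_out)
                      / np.linalg.norm(held_out))
    return np.array(errors)
```
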
“…where $W_w \in \mathbb{R}^{N_f \times n_f}$ is a left fluid ROB. Projecting next the overdetermined system (18) onto the subspace represented by the left fluid-structure ROB $W_q$ (19) transforms this system into the square counterpart…”
Section: Parametric PMOR for FSI and Sensitivities (mentioning, confidence: 99%)
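
To make the projection step concrete: left-multiplying an overdetermined $m \times n$ system by the transpose of an $m \times n$ left ROB with orthonormal columns yields an $n \times n$ square system. The numpy sketch below uses random stand-in matrices, not the paper's fluid-structure operators.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 200, 10                    # overdetermined: more equations than unknowns
A = rng.standard_normal((m, n))   # stand-in for the overdetermined operator
b = rng.standard_normal(m)
# Stand-in left ROB with orthonormal columns (the role played by W_q above).
W, _ = np.linalg.qr(rng.standard_normal((m, n)))

# Petrov-Galerkin projection: W^T (A y - b) = 0 gives an n x n square system.
A_sq = W.T @ A                    # shape (n, n)
b_sq = W.T @ b                    # shape (n,)
y = np.linalg.solve(A_sq, b_sq)
```
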
“…In $\mathcal{X}$, interpolate in the AS the computed matrix logarithms $\Gamma_j = \Gamma(r_j)$, $j = 1, \ldots, N_B$, entry-by-entry, using weighted sums of radial basis functions applied directly to the parameter points $r_j$, $j = 1, \ldots, N_B$.19 Let $\Gamma_\star = \Gamma(r_\star)$ denote the result of this interpolation. 4.…”
Section: Interpolation in the AS on a Matrix Manifold (mentioning, confidence: 99%)
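
The quoted step is the interpolation stage of the standard log/exp recipe for interpolating reduced matrices on a matrix manifold: map each database matrix to a tangent space with the matrix logarithm, interpolate the log entries with RBF weights over the parameter points $r_j$, and map the result back with the matrix exponential. The sketch below is a generic illustration under that reading; the function name and the scipy-based RBF choice are assumptions, not the paper's implementation.

```python
import numpy as np
from scipy.linalg import logm, expm
from scipy.interpolate import RBFInterpolator

def interpolate_on_manifold(params, matrices, param_star):
    """Entry-by-entry RBF interpolation of matrix logarithms (sketch)."""
    params = np.asarray(params, dtype=float).reshape(len(matrices), -1)
    # Step to the tangent space: Gamma_j = log(M_j) for each database matrix.
    logs = np.stack([logm(M).real for M in matrices])
    # Interpolate all log entries at once with RBF weights over the r_j.
    rbf = RBFInterpolator(params, logs.reshape(len(matrices), -1))
    gamma_star = rbf(np.asarray(param_star, dtype=float).reshape(1, -1))
    # Map back to the matrix manifold with the matrix exponential.
    return expm(gamma_star.reshape(logs.shape[1:]))
```
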