2018
DOI: 10.1137/17m1117690

Data-Driven Polynomial Ridge Approximation Using Variable Projection

Abstract: Inexpensive surrogates are useful for reducing the cost of science and engineering studies involving large-scale, complex computational models with many input parameters. A ridge approximation is one class of surrogate that models a quantity of interest as a nonlinear function of a few linear combinations of the input parameters. When used in parameter studies (e.g., optimization or uncertainty quantification), ridge approximations allow the low-dimensional structure to be exploited, reducing the effective dimension […]
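To fix notation for what follows (the symbols are assumptions for illustration, not necessarily the paper's), a ridge approximation and the associated data-driven fitting problem can be written as:

```latex
% Sketch of the ridge approximation setup (assumed notation):
f(\mathbf{x}) \approx g(U^\top \mathbf{x}),
\qquad U \in \mathbb{R}^{m \times n},\ n \ll m,
\qquad g : \mathbb{R}^{n} \to \mathbb{R} \ \text{a polynomial},
\\[4pt]
\min_{U,\, g}\ \sum_{i=1}^{N} \big( f(\mathbf{x}_i) - g(U^\top \mathbf{x}_i) \big)^2 .
```

Because g is a polynomial, its coefficients enter the misfit linearly once U is fixed, which is what makes a variable projection treatment of the fitting problem natural.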


Cited by 43 publications (38 citation statements); references 52 publications.

Citation statements (ordered by relevance):
“…Active subspaces can be seen in the more general context of ridge approximation (see [35,26]). In particular it can be proved that, under certain conditions, the active subspace is nearly stationary and it is a good starting point in optimal ridge approximation as shown in [12,23].…”
Section: Parameter Space Reduction by Active Subspaces
Citation type: mentioning; confidence: 99%
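For context on this excerpt, the active subspace is conventionally built from the averaged outer product of the gradient (a standard construction, stated here in assumed notation):

```latex
% Standard active-subspace construction (assumed notation):
C = \int \nabla f(\mathbf{x})\,\nabla f(\mathbf{x})^\top \rho(\mathbf{x})\,d\mathbf{x}
  = W \Lambda W^\top ,
\qquad W = [\,W_1 \ \ W_2\,],
```

where the leading eigenvectors W_1 span the active subspace. The statement above is that, under suitable conditions, W_1 is nearly stationary for the ridge approximation objective and therefore serves as a good initial guess for the projection U.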
“…Note that (14) corresponds to a total order index set, where the sum of the composite univariate polynomial orders in the multivariate expansion is less than or equal to k. Such a basis was used in Ref. (25), resulting in a total of 351 coefficients (setting d = 25 and k = 2 in (14)) that needed to be estimated. Once obtained, computing gradients of this global quadratic is trivial, resulting in the following estimate of the covariance matrix…”
Section: Computing Active Subspaces via a Quadratic Model
Citation type: mentioning; confidence: 99%
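The excerpt describes fitting a global quadratic surrogate over d = 25 inputs with a total-order-2 basis (1 + 25 + 25·26/2 = 351 coefficients) and averaging its analytic gradients to estimate the covariance matrix. A minimal sketch of that recipe is below; the function names, the plain least-squares solve, and the eigendecomposition step are assumptions rather than the cited implementation.

```python
import numpy as np
from itertools import combinations_with_replacement

# Illustrative sketch only: estimate an active subspace by fitting a global
# quadratic surrogate q(x) = c + b^T x + x^T A x by least squares and then
# averaging its analytic gradients. Names are assumptions, not the cited code.

def fit_quadratic(X, f):
    """Fit c, b, and symmetric A so that q(x) matches f in the least-squares sense."""
    N, d = X.shape
    pairs = list(combinations_with_replacement(range(d), 2))
    # Total-order-2 basis: constant, linear terms, then quadratic cross terms.
    cols = [np.ones(N)]
    cols += [X[:, i] for i in range(d)]
    cols += [X[:, i] * X[:, j] for (i, j) in pairs]
    V = np.column_stack(cols)                     # N x (1 + d + d(d+1)/2) design matrix
    coef, *_ = np.linalg.lstsq(V, f, rcond=None)
    c, b = coef[0], coef[1:1 + d]
    A = np.zeros((d, d))
    for (i, j), a in zip(pairs, coef[1 + d:]):
        A[i, j] += a / 2.0                        # split off-diagonal coefficients evenly;
        A[j, i] += a / 2.0                        # diagonal entries receive the full value
    return c, b, A

def active_subspace_from_quadratic(X, f, n_active):
    """Leading eigenvectors of the gradient covariance of the quadratic model."""
    _, b, A = fit_quadratic(X, f)
    grads = b + 2.0 * X @ A                       # grad q(x) = b + 2 A x (A symmetric)
    C = grads.T @ grads / X.shape[0]              # sample average of gradient outer products
    eigvals, eigvecs = np.linalg.eigh(C)
    idx = np.argsort(eigvals)[::-1]
    return eigvecs[:, idx[:n_active]], eigvals[idx]
```

With d = 25 the design matrix has 1 + 25 + 325 = 351 columns, matching the coefficient count quoted above; calling active_subspace_from_quadratic(X, f, n_active=2) on sampled inputs X (N x 25) and outputs f (length N) would return the two leading directions and the eigenvalue spectrum.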
“…Motivated by the variable projection aided dimension reduction approach in Ref. (14), we devote Section 4 to its study, offering a few minor algorithmic variations. Then, we test this on one of our blades, being parsimonious in the number of samples (and hence CFD evaluations) used.…”
Section: Introduction
Citation type: mentioning; confidence: 99%
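To make the variable projection idea referenced above concrete, the sketch below fits a one-dimensional polynomial ridge approximation: for a fixed direction u the polynomial coefficients are eliminated by an inner linear least-squares solve, leaving an outer optimization over u alone. This is an illustrative toy, not the algorithm of Ref. (14) or of the paper under discussion; the helper names and the BFGS-with-restarts choice are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative variable-projection-style sketch for f(x) ~ g(u^T x) with a
# polynomial profile g. Inner solve eliminates the polynomial coefficients;
# the outer optimization is over the direction u only.

def ridge_residual(u, X, f, degree):
    u = u / np.linalg.norm(u)                     # keep the direction on the unit sphere
    y = X @ u                                     # projected one-dimensional coordinates
    V = np.vander(y, degree + 1)                  # polynomial basis in the projected variable
    coef, *_ = np.linalg.lstsq(V, f, rcond=None)  # inner solve: coefficients projected out
    r = f - V @ coef
    return 0.5 * float(r @ r)

def fit_ridge_1d(X, f, degree=3, n_restarts=5, seed=0):
    """Optimize the direction u with random restarts; BFGS choice is an assumption."""
    rng = np.random.default_rng(seed)
    best = None
    for _ in range(n_restarts):
        u0 = rng.standard_normal(X.shape[1])
        res = minimize(ridge_residual, u0, args=(X, f, degree), method="BFGS")
        if best is None or res.fun < best.fun:
            best = res
    u = best.x / np.linalg.norm(best.x)
    return u, best.fun
```

Eliminating the polynomial coefficients in the inner solve is the design choice that keeps the outer search space small, which is the practical benefit the cited dimension reduction approach exploits when CFD evaluations are scarce.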
“…Note that the ridge recovery problem is distinct from ridge approximation (Constantine et al., 2017b; Hokanson and Constantine, 2017), where the goal is to find A and construct g that minimize the approximation error for a given f. Inverse regression may be useful for ridge approximation or identifying near-ridge structure in a given function, but pursuing these ideas is beyond the scope of this manuscript.…”
Section: Ridge Functions
Citation type: mentioning; confidence: 99%
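The distinction drawn in this excerpt can be stated compactly (assumed notation): ridge recovery presumes f has exact ridge structure and asks for the subspace, while ridge approximation seeks the best ridge fit whether or not exact structure exists.

```latex
% Ridge recovery: exact structure assumed, identify the subspace.
f(\mathbf{x}) = g(A^\top \mathbf{x}) \ \text{for some } g
\quad\Longrightarrow\quad \text{recover } \operatorname{range}(A);
\\[4pt]
% Ridge approximation: best fit over both A and g, no exactness assumed.
\min_{A,\,g}\ \big\| f - g(A^\top \cdot\,) \big\| .
```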