2019
DOI: 10.1137/18m1168571

Dimension Reduction via Gaussian Ridge Functions

Abstract: Ridge functions have recently emerged as a powerful set of ideas for subspace-based dimension reduction. In this paper we begin by drawing parallels between ridge subspaces, sufficient dimension reduction and active subspaces, contrasting techniques rooted in statistical regression with those rooted in approximation theory. This sets the stage for our new algorithm that approximates what we call a Gaussian ridge function, the posterior mean of a Gaussian process on a dimension-reducing subspace, suitable f…
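The construction named in the abstract is easy to prototype: project the inputs onto a low-dimensional subspace and fit a Gaussian process on the projected coordinates, so the posterior mean becomes the ridge approximation. Below is a minimal sketch using NumPy and scikit-learn with a hand-picked subspace matrix U; the paper's algorithm instead optimises U over a matrix manifold, which is not shown here.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Toy model: f varies only along two directions of a 10-D input space.
rng = np.random.default_rng(0)
d, n = 10, 200
X = rng.uniform(-1, 1, size=(n, d))

# A "true" 2-D ridge subspace (chosen by hand here; unknown in practice).
U = np.zeros((d, 2))
U[0, 0] = U[1, 1] = 1.0
y = np.sin(X @ U[:, 0]) + 0.5 * (X @ U[:, 1]) ** 2

# Gaussian ridge function: GP posterior mean over the projected inputs U^T x.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=1e-6)
gp.fit(X @ U, y)

X_test = rng.uniform(-1, 1, size=(5, d))
print(gp.predict(X_test @ U))  # evaluate the ridge approximation at new points
```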

Cited by 22 publications (15 citation statements).
References 35 publications.
“…It is worth mentioning that there have been significant efforts to reduce the dimensionality in Bayesian optimization using active subspaces [51,52]. However, the true difficulty lies in approximating the high-dimensional gradients in the classical GP approach, which is originally gradient-free, in order to obtain an optimal rotation matrix on the Stiefel [51,52] or Grassmann [54,55] manifolds in the active subspace approach. This unnecessarily complicates an optimization problem (by adding the task of discovering the active subspace, which is usually treated as a constrained manifold optimization problem) that is already challenging in high-dimensional space.…”
Section: High-dimensional Bayesian Optimization via Random Embeddings
confidence: 99%
“…Techniques for estimating U build on ideas from sufficient dimension reduction (6) and more recent works such as active subspaces (5) and polynomial (8,9) and Gaussian (10) ridge approximations. While our work in this paper is invariant to the specific parameter-space dimension reduction technique utilised, we briefly detail a few ideas within ridge approximation.…”
Section: Techniques for Dimension Reduction
confidence: 99%
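For context, the ridge approximations discussed in these statements take the form f(x) ≈ g(U^T x), where U is a tall d × n matrix (n ≪ d) whose columns span the dimension-reducing subspace and g is a low-dimensional profile function; the cited techniques differ in how they parametrise g and how they estimate U.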
“…over the space of matrices U (restricted to a matrix manifold) and the coefficients (or hyperparameters) α associated with the parametric function g. This is a challenging optimisation problem, and it is not convex. In Seshadri et al. (10) the authors assume that g is the posterior mean of a Gaussian process (GP) and iteratively solve for the hyperparameters associated with the GP, whilst optimising U using a conjugate gradient optimiser on the Stiefel manifold (see Absil et al. (11)). In Constantine et al. (9) the authors set g to be a polynomial and iteratively solve for its coefficients, using standard least squares regression, whilst optimising over the Grassmann manifold to estimate the subspace U.…”
Section: Techniques for Dimension Reduction
confidence: 99%
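The alternating scheme this statement describes is straightforward to prototype. Below is a minimal sketch, assuming a quadratic profile g fitted by least squares and a plain gradient step with QR re-orthonormalisation in place of the Stiefel/Grassmann conjugate-gradient solvers the statement cites; the function names and dimensions are illustrative.

```python
import numpy as np

def fit_ridge_alternating(X, y, n_sub=1, iters=50, step=0.1, seed=0):
    """Toy alternating scheme for f(x) ~ g(U^T x) with a separable quadratic g.

    Fix U and fit g's coefficients by least squares; then take a gradient
    step in U and re-orthonormalise with QR -- a crude stand-in for the
    manifold optimisers used in the cited works.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    U, _ = np.linalg.qr(rng.normal(size=(d, n_sub)))
    for _ in range(iters):
        Z = X @ U
        # Design matrix for a quadratic polynomial in the reduced variables.
        P = np.hstack([np.ones((len(Z), 1)), Z, Z ** 2])
        c, *_ = np.linalg.lstsq(P, y, rcond=None)
        resid = P @ c - y
        # Chain rule: d(resid_i)/dU = x_i * g'(z_i), with g'(z) = c_lin + 2 c_quad z.
        dgdz = c[1:1 + n_sub] + 2.0 * Z * c[1 + n_sub:]
        grad = X.T @ (resid[:, None] * dgdz) / len(y)
        U, _ = np.linalg.qr(U - step * grad)  # descend, then re-orthonormalise
    return U, c

# Toy usage: a 1-D ridge hidden in 8 dimensions.
rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, size=(300, 8))
u_true = np.ones(8) / np.sqrt(8)
y = (X @ u_true) ** 2 + 0.3 * (X @ u_true)
U_hat, _ = fit_ridge_alternating(X, y)
print(np.abs(U_hat[:, 0] @ u_true))  # close to 1 if the subspace is recovered
```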
“…Ridge function approximations can be used to reduce the effective dimension of a high-dimensional problem and have been constructed using a number of methods [13, 35-37]. One promising set of ideas is rooted in active subspaces [13], a subspace-based dimension reduction framework, which seeks to find the few linear combinations of the input space that best describe the variability of a quantity of interest (qoi). The effective dimension of the problem is reduced by projecting the input parameters onto this subspace, resulting in a new set of design variables with reduced dimension: the active variables.…”
Section: II.B Ridge Functions
confidence: 99%
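The active-subspace construction this statement summarises is conventionally computed from gradients: form a Monte Carlo estimate of C = E[∇f ∇f^T], eigendecompose it, and keep the leading eigenvectors as the active directions. A minimal sketch, assuming gradients of the qoi are available (the toy function and sampling density are illustrative):

```python
import numpy as np

def active_subspace(grad_f, d, n_samples=500, n_active=2, seed=0):
    # Monte Carlo estimate of C = E[grad f grad f^T], then eigendecomposition.
    rng = np.random.default_rng(seed)
    X = rng.uniform(-1, 1, size=(n_samples, d))
    G = np.array([grad_f(x) for x in X])   # (n_samples, d) gradient samples
    C = G.T @ G / n_samples
    eigvals, eigvecs = np.linalg.eigh(C)   # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]
    return eigvecs[:, order[:n_active]], eigvals[order]

# Toy qoi: f(x) = sin(a^T x), so the active subspace is exactly span{a}.
d = 12
a = np.arange(1.0, d + 1.0)
a /= np.linalg.norm(a)
grad_f = lambda x: np.cos(a @ x) * a
W, lam = active_subspace(grad_f, d, n_active=1)
print(np.abs(W[:, 0] @ a))  # ~1: the leading active direction aligns with a
```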