2018 IEEE Conference on Decision and Control (CDC)
DOI: 10.1109/cdc.2018.8619219
An active subspace method for accelerating convergence in Delaunay-based optimization via dimension reduction

Cited by 7 publications (6 citation statements)
References 14 publications
“…This algorithm is called optimization by moving ridge functions (OMoRF), as it leverages local ridge function approximations that move through the function domain. Although other optimization algorithms have used subspaces to reduce the problem dimension (Wang et al. 2016; Zhao, Alimo, and Bewley 2018; Kozak et al. 2019), OMoRF differs from these in some key aspects. First, it is completely derivative-free.…”
Section: Introduction
confidence: 99%
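For reference, a ridge function approximation of the kind OMoRF builds takes the standard form below; the notation is generic rather than taken from the cited paper, and the subspace dimension $m$ and matrix $\mathbf{W}$ are illustrative:

$$f(\mathbf{x}) \approx g(\mathbf{W}^{\top}\mathbf{x}), \qquad \mathbf{W} \in \mathbb{R}^{n \times m}, \quad m \ll n,$$

where $g:\mathbb{R}^m \to \mathbb{R}$ is a low-dimensional profile fitted over the subspace spanned by the columns of $\mathbf{W}$; "moving" the ridge function, as the quotation describes, amounts to refitting $g$ and $\mathbf{W}$ locally as the optimization progresses.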
“…Although the VRSSD algorithm does not require full gradient calculations, it still requires directional derivatives to be computed. Similarly, the Delaunay-based derivative-free optimization via global surrogates with active subspace method (Δ-DOGS with ASM) proposed by Zhao, Alimo, and Bewley (2018) requires an initial sample of gradient evaluations to determine the dimension-reducing subspace. Second, the subspaces computed by OMoRF correspond to the directions of strongest variability of the objective function.…”
Section: Introduction
confidence: 99%
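For context, the dimension-reducing subspace in active subspace methods is conventionally estimated from exactly such a sample of gradients: one forms the empirical average of gradient outer products and takes its leading eigenvectors. Below is a minimal Python sketch of that standard construction; it is not the authors' implementation, and the test function, domain, and sample count are illustrative assumptions.

```python
import numpy as np

def active_subspace(grad_f, n_samples, dim, k, rng=None):
    """Estimate a k-dimensional active subspace from sampled gradients."""
    rng = np.random.default_rng(rng)
    C = np.zeros((dim, dim))
    for _ in range(n_samples):
        x = rng.uniform(0.0, 1.0, dim)  # sample point in the unit cube (assumed domain)
        g = grad_f(x)
        C += np.outer(g, g)             # accumulate gradient outer products
    C /= n_samples
    eigvals, eigvecs = np.linalg.eigh(C)  # eigh: C is symmetric; ascending order
    order = np.argsort(eigvals)[::-1]     # re-sort descending
    return eigvecs[:, order[:k]], eigvals[order]

# Illustrative ridge-like test function f(x) = (a^T x)^2, which varies
# only along the direction a, so the active subspace is span{a}.
a = np.array([1.0, 2.0, 0.1, 0.05])
grad_f = lambda x: 2.0 * (a @ x) * a
U, lam = active_subspace(grad_f, n_samples=200, dim=4, k=1)
print(lam)       # one dominant eigenvalue, the rest near zero
print(U[:, 0])   # approximately +/- a / ||a||
```

A sharp drop in the eigenvalue spectrum after the first few eigenvalues is what signals that a low-dimensional subspace captures most of the objective's variability.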
“…9 However, their use in optimisation has been rather limited. 4,10,11 Given the opportunities these ridge function approximations afford, a closer examination of ridge functions in the context of optimisation is necessary.…”
Section: Introduction
confidence: 99%
“…Although their research was focused on using ridge functions for preliminary design optimisation studies, they did not address the possibility of using such ridge approximations for greater refinement during optimisation. Zhao et al. 10 demonstrated that such optimisation refinement studies were possible by using ridge function surrogates in a Delaunay-based optimisation via global surrogates (∆-DOGS) method. This approach sought to approximately solve the bound-constrained optimisation problem…”
Section: Introduction
confidence: 99%
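The quotation truncates before the formulation itself; for orientation only, a bound-constrained problem in generic notation (the bound symbols $\ell$ and $u$ are assumptions, not copied from the paper) reads

$$\min_{\mathbf{x} \in \mathbb{R}^n} f(\mathbf{x}) \quad \text{subject to} \quad \ell \le \mathbf{x} \le u.$$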