Proceedings of the 13th Annual Conference on Genetic and Evolutionary Computation 2011
DOI: 10.1145/2001576.2001695

Local-meta-model CMA-ES for partially separable functions

Abstract: In this paper, we propose a new variant of the covariance matrix adaptation evolution strategy with local meta-models (lmm-CMA) for optimizing partially separable functions. We propose to exploit partial separability by building at each iteration a meta-model for each element function (or sub-function) using a full quadratic local model. After introducing the approach we present some first experiments using element functions with dimensions 2 and 4. Our results demonstrate that, as expected, exploiting partial…
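The core idea in the abstract can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: the element subsets, function names, and the plain least-squares fit are our assumptions, and lmm-CMA additionally decides, via approximate ranking, when true function evaluations are still needed.

```python
import numpy as np

# Minimal sketch (assumptions ours, not the authors' code): a partially
# separable objective f(x) = sum_i f_i(x[S_i]), where each element function
# f_i depends only on a small index subset S_i. One full quadratic model is
# fitted per element function from archived (sub-vector, value) pairs.

def fit_full_quadratic(X, y):
    """Least-squares fit of y ~ sum_{j<=k} a_jk x_j x_k + b.x + c."""
    n, d = X.shape
    cols = [X[:, j] * X[:, k] for j in range(d) for k in range(j, d)]
    cols += [X[:, j] for j in range(d)] + [np.ones(n)]
    Z = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return coef, d

def predict_quadratic(model, x):
    coef, d = model
    feats = [x[j] * x[k] for j in range(d) for k in range(j, d)]
    feats += list(x) + [1.0]
    return float(np.dot(coef, feats))

# Hypothetical element structure: three overlapping 2-D sub-functions.
subsets = [(0, 1), (1, 2), (2, 3)]

def surrogate(x, models):
    """Meta-model of f(x): the sum of the per-element quadratic models."""
    return sum(predict_quadratic(m, x[list(S)]) for S, m in zip(subsets, models))
```

The expected saving is easy to see from the model sizes: a full quadratic in an element function of dimension 2 has only 6 coefficients, while a global quadratic in dimension n needs (n+1)(n+2)/2, so far fewer evaluated points suffice to fit the local models.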

Cited by 15 publications (6 citation statements). References 16 publications.
“…The GA leads to well configurations dispersed over a large zone. The maximum value of NPV obtained by the GA is equal to $1.86 × 10^10, and it corresponds to a well configuration close to a well configuration obtained by CMA-ES with an NPV of $2.05 × 10^10.…”
Section: Well Placement Performance (supporting; confidence: 61%)
“…We believe, however, that performance over CMA-ES could be more significantly improved by exploiting, within the algorithm, knowledge and relevant information about the optimization problem at hand, such as the problem structure. Some first steps in that direction have been conducted in [10,11], which exploit the fact that the objective function can be split into local components, one for each of the wells, each depending on a smaller number of parameters (i.e., the partial separability of the objective function). Another approach could be to exploit some a priori information such as well allocation factors and connectivity using the work developed in [15].…”
Section: Discussion (mentioning; confidence: 99%)
“…The Rosenbrock and Schwefel's 1.2 functions are well-known, classical benchmark functions that have been widely used to evaluate EAs [22]. The Block-rotated Ellipsoid [1] is a partially separable function designed to only have dependencies between z_i and z_{i+1}, and for distributed functions, P is a randomly generated ordering such as P = (6, 1, 3, ...). The 2 × 2 rotation matrix R_i is uniformly generated according to the method of [15], and α = 1e+6.…”
Section: Exponential Crossover On Adjacent/Distributed Functions (mentioning; confidence: 99%)
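To make the quoted construction concrete, here is a hedged sketch of a block-rotated ellipsoid in the spirit described above; the exact definition is in the cited reference [1], so the pairing scheme, the conditioning schedule, and all names here are our assumptions.

```python
import numpy as np

ALPHA = 1e6  # conditioning constant, α = 1e+6 as in the quote

def random_rotation_2d(rng):
    # For 2x2 matrices, a uniformly random angle gives a uniformly random
    # rotation (the quote cites a general method from [15]).
    theta = rng.uniform(0.0, 2.0 * np.pi)
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def block_rotated_ellipsoid(x, rotations, perm=None):
    """Assumed form: ellipsoid weights applied to independently rotated
    coordinate pairs. `perm` reorders coordinates for the 'distributed'
    variant (e.g. P = (6, 1, 3, ...)); None gives the adjacent variant."""
    z = np.asarray(x) if perm is None else np.asarray(x)[list(perm)]
    d = len(z)
    total = 0.0
    for i in range(0, d - 1, 2):            # assumption: disjoint pairs
        y = rotations[i // 2] @ z[i:i + 2]  # rotate the pair (z_i, z_{i+1})
        w = ALPHA ** (np.arange(i, i + 2) / (d - 1))  # ellipsoid scaling
        total += float(np.dot(w, y * y))
    return total

rng = np.random.default_rng(0)
d = 8
rots = [random_rotation_2d(rng) for _ in range(d // 2)]
print(block_rotated_ellipsoid(rng.standard_normal(d), rots))
```

Because each rotation couples only one pair of coordinates, the function stays partially separable with element dimension 2, which is exactly the regime the paper above experiments with.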
“…This is actually a specific consequence of another possibility for using information from CMA-ES for the GP: replacing the original space of d-dimensional vectors, R^d, by the space of their principal components with respect to C^(g). In this context, it is worth recalling that in [2,3,15], the Mahalanobis distance was used instead of the Euclidean distance when combining CMA-ES with quadratic response surface models. In the approach presented there, the space of the principal components with respect to C^(g) is used, together with an estimate of density, to locally weight the model predictions with respect to the considered input.…”
Section: Using Information From CMA-ES For The GP (mentioning; confidence: 99%)
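The Mahalanobis-distance idea mentioned in this quote is compact enough to show directly. A minimal sketch, with names of our choosing (not an API from the cited papers): distances between points are measured in the metric induced by the current CMA-ES covariance matrix C^(g), e.g. to select the nearest archived points for a local surrogate.

```python
import numpy as np

def mahalanobis_dist(x, y, C):
    """sqrt((x - y)^T C^{-1} (x - y)) for symmetric positive-definite C."""
    diff = np.asarray(x) - np.asarray(y)
    u = np.linalg.solve(C, diff)  # avoids forming C^{-1} explicitly
    return float(np.sqrt(diff @ u))

def k_nearest(archive, query, C, k):
    """Indices of the k archived points closest to `query` under C."""
    dists = np.array([mahalanobis_dist(x, query, C) for x in archive])
    return np.argsort(dists)[:k]
```

Equivalently, one can map all points through C^{-1/2} (from an eigendecomposition of C) and use plain Euclidean distance there; that whitened space is closely related to the "space of principal components with respect to C^(g)" that the quote refers to.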