2020
DOI: 10.1007/s10898-020-00917-9
Filter-based stochastic algorithm for global optimization

Abstract: We propose the general Filter-based Stochastic Algorithm (FbSA) for the global optimization of nonconvex and nonsmooth constrained problems. Under certain conditions on the probability distributions that generate the sample points, almost sure convergence is proved. In order to optimize problems with computationally expensive black-box objective functions, we develop the FbSA-RBF algorithm, based on the general FbSA and assisted by Radial Basis Function (RBF) surrogate models to approximate the objective function…
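The abstract refers to RBF surrogate models used to approximate an expensive black-box objective. As a rough illustration only, and not the paper's FbSA-RBF algorithm, the sketch below fits a Gaussian RBF interpolant to sampled objective values and evaluates it at a new point; the kernel choice, the width `eps`, the helper names, and the toy objective are all assumptions made for this example.

```python
import numpy as np

def rbf_fit(X, f, eps=1.0):
    """Fit a Gaussian RBF interpolant s(x) = sum_j w_j * phi(||x - x_j||).

    X : (n, d) array of sample points; f : (n,) array of objective values.
    The kernel width `eps` is a tunable assumption, not a value from the paper.
    """
    r = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # pairwise distances
    Phi = np.exp(-(eps * r) ** 2)                                # Gaussian RBF matrix
    return np.linalg.solve(Phi, f)                               # interpolation weights

def rbf_eval(x, X, w, eps=1.0):
    """Evaluate the fitted surrogate at a new point x of shape (d,)."""
    r = np.linalg.norm(X - x, axis=-1)
    return np.exp(-(eps * r) ** 2) @ w

# Toy usage: build a cheap surrogate of an "expensive" 2-D objective.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(30, 2))
f = np.sum(X ** 2, axis=1)                 # stand-in for a black-box objective
w = rbf_fit(X, f)
print(rbf_eval(np.array([0.2, -0.3]), X, w))  # surrogate prediction, no expensive call
```

In a surrogate-assisted loop such a model would be refit as new sample points are evaluated, with candidate points screened on the cheap surrogate before the expensive objective is called.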

Cited by 5 publications (3 citation statements)
References 44 publications
“…When problem (1) is nonconvex and large-scale, it may have many local minimizers and it is a challenge to find its global minimum. For problem (1), there are many popular global optimization methods, such as the multi-start methods [11,20,26,43,73,85], the branch-and-bound methods [10,17,76,83], the genetic evolution algorithms [32,46,59,63] and their memetic algorithms [35,60,78,84].…”
Section: Introduction (mentioning)
confidence: 99%
“…When problem (1) is nonconvex and large-scale, it may have many local minimizers and it is a challenge to find its global minimum. For problem (1), there are some popular global optimization methods, such as the multi-start methods [7,15,48,56], the genetic evolution algorithms [20,28,40,43] and their hybrid algorithms [22,41,51,55].…”
Section: Introduction (mentioning)
confidence: 99%