2010 Sixth IEEE International Conference on e-Science Workshops
DOI: 10.1109/esciencew.2010.25

Automated, Parallel Optimization of Stochastic Functions Using a Modified Simplex Algorithm

Abstract: This paper proposes a framework and new parallel algorithm for optimization of stochastic functions based on a downhill simplex algorithm. The function to be optimized is assumed to be subject to random noise, the variance of which decreases with sampling time; this is the situation expected for many real-world and simulation applications where results are obtained from sampling, and contain experimental error or random noise. The proposed optimization method is found to be comparable to previous stochastic op…
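
For readers unfamiliar with the problem setting, the sketch below illustrates the kind of objective the abstract describes: a deterministic function observed through noise whose standard deviation shrinks as the sampling budget grows. It uses SciPy's standard Nelder-Mead implementation rather than the authors' modified, parallel simplex, and the test function and the 1/sqrt(n) noise model are illustrative assumptions, not taken from the paper.

# Illustrative sketch only: standard Nelder-Mead on a sampled objective.
# The noise model (std ~ 1/sqrt(n_samples)) and the quadratic test function
# are assumptions for demonstration; the paper's modified, parallel simplex
# algorithm is not reproduced here.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def noisy_objective(x, n_samples=100):
    # Quadratic bowl observed through sampling noise: the standard deviation
    # of the estimate shrinks as 1/sqrt(n_samples), mimicking objectives
    # evaluated by stochastic simulation or repeated measurement.
    true_value = np.sum((x - 1.0) ** 2)
    return true_value + rng.normal(scale=1.0 / np.sqrt(n_samples))

result = minimize(
    noisy_objective,
    x0=np.array([3.0, -2.0]),
    method="Nelder-Mead",
    options={"xatol": 1e-2, "fatol": 1e-2, "maxiter": 500},
)
print(result.x)  # lands near the true minimum at [1, 1], up to residual noise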

Cited by 2 publications (1 citation statement)
References 12 publications
“…The principle of the algorithm is based on the observation that the solution that maximizes the objective function MW_d(u) is located at some extreme point or vertex of the feasible region. The algorithm proceeds to a sequential search of the vertices, delimiting the feasible region until further correction of the value of the objective function is no longer possible [16,17].…”
Mentioning (confidence: 99%)
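
The citation statement above describes the general vertex-search principle: for a linear objective over a polyhedral feasible region, the optimum is attained at an extreme point, and simplex-type methods move from vertex to vertex until no further improvement is possible. The toy linear program below is a generic illustration of that principle; its objective and constraints are arbitrary placeholders, not the MW_d(u) problem of the citing work.

# Generic illustration of the quoted principle using SciPy's LP solver.
# The coefficients are placeholders chosen for demonstration only.
from scipy.optimize import linprog

# Maximize 3*x0 + 2*x1 (linprog minimizes, so the coefficients are negated)
c = [-3.0, -2.0]
A_ub = [[1.0, 1.0],   # x0 + x1 <= 4
        [1.0, 0.0],   # x0      <= 3
        [0.0, 1.0]]   # x1      <= 2
b_ub = [4.0, 3.0, 2.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None)], method="highs")
print(res.x)  # optimum at the vertex (3, 1) of the feasible polygon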