2017
DOI: 10.1016/j.jcp.2017.07.010

Stochastic level-set method for shape optimisation

Abstract: We present a new method for stochastic shape optimisation of engineering structures. The method generalises an existing deterministic scheme, in which the structure is represented and evolved by a level-set method coupled with mathematical programming. The stochastic element of the algorithm is built on the methods of statistical mechanics and is designed so that the system explores a Boltzmann-Gibbs distribution of structures. In non-convex optimisation problems, the deterministic algorithm can get trapped in…
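The abstract's central idea is sampling structures from a Boltzmann-Gibbs distribution rather than following a purely deterministic descent. As an illustration of that general idea only, and not of the paper's actual level-set update (which couples boundary evolution with mathematical programming), the sketch below uses a Metropolis acceptance rule over hypothetical design perturbations; the names objective, perturb, and the temperature-like parameter T are assumptions made for the example.

import math
import random

def metropolis_step(design, objective, perturb, T, rng=random):
    """One Metropolis update targeting the distribution exp(-objective(design)/T).

    design, objective, and perturb are placeholders: in a shape-optimisation
    setting, design could be a discretised level-set function and perturb a
    small random boundary change.
    """
    candidate = perturb(design)
    delta = objective(candidate) - objective(design)
    # Always accept improvements; accept worse designs with Boltzmann probability,
    # which is what lets the sampler escape local minima at finite T.
    if delta <= 0 or rng.random() < math.exp(-delta / T):
        return candidate
    return design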

Cited by 19 publications (5 citation statements)
References 27 publications
“…This section briefly summarizes the level set topology optimization method used in this study. More details of the method can be found in Hedges et al (2017) and Picelli et al (2018).…”
Section: Level Set Topology Optimization Method (mentioning)
confidence: 99%
“…This is also advantageous when handling a high number of constraints, as shown in Dunning et al (2016). More details of this optimization formulation can be found in Picelli et al (2018a) and Hedges et al (2017).…”
Section: Linearized Sub-problem (mentioning)
confidence: 99%
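The linearized sub-problem referred to in the statement above is, in general terms, a small linear program in the boundary-movement variables built from objective and constraint sensitivities, which is why many constraints can be handled at once. The sketch below is only an illustration of that kind of sub-problem under assumed inputs; the sensitivity vectors, the move limit, and the use of scipy.optimize.linprog are hypothetical and do not reproduce the specific formulation of Picelli et al (2018a) or Hedges et al (2017).

import numpy as np
from scipy.optimize import linprog

def solve_linearised_subproblem(df, dg, g, move_limit):
    """Solve  min_d  df @ d   s.t.  dg @ d <= -g,  |d_i| <= move_limit.

    df : objective sensitivities w.r.t. boundary movements d (1-D array)
    dg : constraint sensitivities (2-D array, one row per constraint)
    g  : current constraint values, with g <= 0 meaning feasible
    All inputs are assumed quantities; how they are computed depends on
    the discretisation and sensitivity analysis used.
    """
    n = df.size
    bounds = [(-move_limit, move_limit)] * n  # trust-region-style move limits
    res = linprog(c=df, A_ub=dg, b_ub=-g, bounds=bounds, method="highs")
    return res.x if res.success else np.zeros(n)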
“…The idea of those is to use partial data in each iteration of the optimization loop in the most efficient way possible. We mention that related techniques have become popular in many other applications, including the stochastic gradient descent (SGD) method in large-scale machine learning applications [57,58,59,60]. In some of those works it is reported that the Kaczmarz approach might also be efficient in avoiding being trapped in some local minima.…”
Section: A Nonlinear Kaczmarz Scheme for Data Processing (mentioning)
confidence: 99%
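The partial-data idea in that statement can be illustrated with the classical randomised Kaczmarz iteration for a linear system Ax = b, where each update touches only a single row of the data. This is a generic sketch of the approach the citing authors refer to, not their nonlinear scheme; the sampling rule and parameter names are assumptions for the example.

import numpy as np

def randomized_kaczmarz(A, b, iters=1000, seed=0):
    """Randomised Kaczmarz for Ax = b: each step projects the current iterate
    onto the hyperplane defined by one randomly chosen row, so only partial
    data is used per iteration."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    # Sample rows with probability proportional to their squared norm,
    # as in the standard randomised variant.
    row_norms = np.sum(A * A, axis=1)
    probs = row_norms / row_norms.sum()
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        x += (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x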