2022
DOI: 10.1016/j.cma.2021.114202

An adaptive method of moving asymptotes for topology optimization based on the trust region

Cited by 11 publications (2 citation statements)
References 27 publications
“…Although the GTO methods (e.g., the MMA-based framework) exhibit higher computational efficiency than the NGTO methods, they depend more on the selected initial designs with existing BGs. Very recently, Hu et al. [141] proposed a novel gradient-based optimizer, the Trust Region based Moving Asymptotes (TRMA), which showed stronger robustness than the frequently used MMA in solving some benchmark topology optimization problems (such as stress-constrained problems).…”
Section: Acoustic Waves (mentioning)
confidence: 99%
“…This iterative inner solve, which can be significantly expensive, determines the update of the design variables. Several MMA-based approaches have been developed, such as the GCMMA [111], which guarantees convergence of the algorithm, the TRMMA [60], which presents improved convergence, and the EMMA [21], which incorporates second-order information, but none of them address the high computational cost.…”
Section: Optimization Algorithm (mentioning)
confidence: 99%
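
For readers unfamiliar with the inner solve mentioned in the statement above, the sketch below illustrates the general idea behind MMA-type updates: the objective is replaced by a convex, separable approximation built from the current sensitivities and a pair of moving asymptotes, and minimizing that approximation yields the next design. This is a minimal, unconstrained Python sketch for illustration only; the function name, the fixed asymptotes, the move limit, and the toy data are assumptions, and it is not the TRMA/TRMMA algorithm of the cited paper, which additionally handles constraints and a trust-region acceptance test.

import numpy as np

def mma_like_update(x0, grad, low, upp, xmin=0.0, xmax=1.0, move=0.1, eps=1e-6):
    """One unconstrained MMA-style design update (illustrative only).

    Builds a convex separable approximation of a single objective around x0,
    using given asymptotes low < x0 < upp, and minimizes it coordinate-wise
    inside box and move limits. Constraints and the dual inner solve of the
    full MMA are deliberately omitted.
    """
    # Separable approximation: f(x) ~ sum_j p_j/(upp_j - x_j) + q_j/(x_j - low_j) + r
    p = (upp - x0) ** 2 * (np.maximum(grad, 0.0) + 0.001 * np.abs(grad) + eps)
    q = (x0 - low) ** 2 * (np.maximum(-grad, 0.0) + 0.001 * np.abs(grad) + eps)

    # Stationary point of the convex 1-D approximation in each coordinate:
    # p/(upp - x)^2 = q/(x - low)^2  =>  x = (sqrt(p)*low + sqrt(q)*upp) / (sqrt(p) + sqrt(q))
    sp, sq = np.sqrt(p), np.sqrt(q)
    x_new = (sp * low + sq * upp) / (sp + sq)

    # Keep the step strictly inside the asymptotes, the design box, and the move limit.
    lb = np.maximum.reduce([low + 0.1 * (x0 - low), x0 - move, np.full_like(x0, xmin)])
    ub = np.minimum.reduce([upp - 0.1 * (upp - x0), x0 + move, np.full_like(x0, xmax)])
    return np.clip(x_new, lb, ub)

if __name__ == "__main__":
    # Toy example: three densities in [0, 1] with hypothetical sensitivities.
    x0 = np.array([0.5, 0.5, 0.5])
    grad = np.array([-2.0, 0.5, -0.1])     # assumed objective gradient
    low, upp = x0 - 0.5, x0 + 0.5          # fixed asymptotes for this sketch
    print(mma_like_update(x0, grad, low, upp))

In the full methods discussed by the citing works, the asymptotes are adapted between outer iterations and the approximate subproblem also includes the (approximated) constraints, which is what makes the repeated inner solve costly for large-scale problems.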