6th Symposium on Multidisciplinary Analysis and Optimization 1996
DOI: 10.2514/6.1996-4019
Concurrent Subspace Optimization using gradient-enhanced neural network approximations

Abstract: Design space approximations have proven useful as a means of coordinating individual discipline design decisions in the multidisciplinary design of complex, coupled systems. Artificial neural networks have been used to provide these parameterized response surface approximations. A method has been developed in which neural networks can be trained using both state and state sensitivity information. This allows for more compact network geometries and reduces the number of coupled system analyses required to develo…
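As a rough illustration of the gradient-enhanced training idea described in the abstract, the sketch below fits a small multilayer perceptron to both sampled state values and their sensitivities (a Sobolev-style loss). The toy analysis function, network size, learning rate, and gradient-loss weight are illustrative assumptions, not the formulation used in the paper.

```python
# Minimal sketch (JAX): train a small MLP on both state values y(x) and
# state sensitivities dy/dx, as a stand-in for a gradient-enhanced
# response-surface approximation. The toy analysis f and all
# hyperparameters are assumptions for illustration only.
import jax
import jax.numpy as jnp

def init_params(key, sizes=(2, 8, 1)):
    keys = jax.random.split(key, len(sizes) - 1)
    return [(jax.random.normal(k, (m, n)) / jnp.sqrt(m), jnp.zeros(n))
            for k, m, n in zip(keys, sizes[:-1], sizes[1:])]

def mlp(params, x):
    # Single scalar "state" approximation for a 2-D design point x.
    for W, b in params[:-1]:
        x = jnp.tanh(x @ W + b)
    W, b = params[-1]
    return (x @ W + b)[0]

# Toy "system analysis": state value and its analytic sensitivity.
f = lambda x: jnp.sin(x[0]) * x[1] ** 2
df = jax.grad(f)

key = jax.random.PRNGKey(0)
X = jax.random.uniform(key, (32, 2), minval=-1.0, maxval=1.0)
Y, dY = jax.vmap(f)(X), jax.vmap(df)(X)

def loss(params, X, Y, dY, w_grad=1.0):
    # Squared error on states plus a penalty on sensitivity mismatch.
    pred = jax.vmap(lambda x: mlp(params, x))(X)
    dpred = jax.vmap(lambda x: jax.grad(mlp, argnums=1)(params, x))(X)
    return jnp.mean((pred - Y) ** 2) + w_grad * jnp.mean((dpred - dY) ** 2)

params = init_params(jax.random.PRNGKey(1))
for _ in range(500):  # a few hundred plain gradient-descent steps for a toy fit
    grads = jax.grad(loss)(params, X, Y, dY)
    params = jax.tree_util.tree_map(lambda p, g: p - 0.05 * g, params, grads)
print("final training loss:", loss(params, X, Y, dY))
```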

Cited by 25 publications (12 citation statements). References 9 publications. Citing publications span 1997-2020.
“…Mathematical aspects of the algorithm are validated by showing that the Karush-Kuhn-Tucker (K-K-T) condition is satisfied when a final solution is found [15][16][17]. Numerical performance is verified through obvious mathematical problems and the results are compared with those from other MDO algorithms.…”
Section: Introduction (mentioning, confidence: 99%)
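The K-K-T check mentioned in that citation can be pictured numerically: at a candidate optimum one verifies primal feasibility, non-negative multipliers, complementary slackness, and stationarity of the Lagrangian. The sketch below does this for a hypothetical single-constraint problem; the problem, candidate point, multiplier, and tolerance are assumptions, not taken from the cited work.

```python
# Minimal sketch (JAX): numerically check the K-K-T conditions at a candidate
# solution of  min f(x)  s.t.  g(x) <= 0.  The problem and the candidate
# point/multiplier are hypothetical, chosen so the conditions hold.
import jax
import jax.numpy as jnp

f = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2   # objective
g = lambda x: x[0] + x[1] - 2.0                        # inequality constraint g <= 0

def kkt_residuals(x, lam):
    # Stationarity: grad f + lam * grad g = 0
    stat = jax.grad(f)(x) + lam * jax.grad(g)(x)
    return {
        "stationarity": jnp.linalg.norm(stat),
        "primal_feasibility": jnp.maximum(g(x), 0.0),
        "dual_feasibility": jnp.maximum(-lam, 0.0),
        "complementary_slackness": jnp.abs(lam * g(x)),
    }

# Analytic optimum of this toy problem: x* = (0.5, 1.5), lambda* = 1.0.
x_star, lam_star = jnp.array([0.5, 1.5]), 1.0
res = kkt_residuals(x_star, lam_star)
assert all(v < 1e-6 for v in res.values()), res
print(res)
```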
“…The problem is solved by many algorithms in References [3,15]. It is decomposed into two subspaces: one with x 1 and x 2 , and the other with x 3 as design variables.…”
Section: Mathematical Example (mentioning, confidence: 99%)
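The two-subspace decomposition described in that citation can be pictured with a simple alternation between subspace problems over disjoint design variables. The sketch below is a bare block-coordinate simplification on a hypothetical objective; it omits the coupled-state approximations and coordination that CSSO actually uses, and the objective, step sizes, and iteration counts are assumptions.

```python
# Minimal sketch (JAX): alternate between two "subspaces" that each update
# their own design variables, (x1, x2) and (x3,), while the other's are frozen.
# A plain block-coordinate simplification of the decomposition idea, not the
# CSSO coordination strategy itself; F is a hypothetical coupled objective.
import jax
import jax.numpy as jnp

def F(x12, x3):
    x1, x2 = x12
    return (x1 - 1.0) ** 2 + (x2 + 2.0) ** 2 + (x3 - x1 * x2) ** 2

x12 = jnp.array([0.0, 0.0])   # subspace 1 design variables
x3 = jnp.array(0.0)           # subspace 2 design variable

for cycle in range(50):
    # Subspace 1: descend over (x1, x2) with x3 frozen.
    for _ in range(20):
        x12 = x12 - 0.1 * jax.grad(F, argnums=0)(x12, x3)
    # Subspace 2: descend over x3 with (x1, x2) frozen.
    for _ in range(20):
        x3 = x3 - 0.1 * jax.grad(F, argnums=1)(x12, x3)

print("x1, x2 =", x12, " x3 =", x3, " F =", F(x12, x3))
```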
“…For example, CO may encounter convergence problem if the formulation is degenerate (DeMiguel and Murray 2006), and BLISS may entail lots of iterative cycles to converge if approximation models are inaccurate or initial bounds on design variables are not properly defined (Zhao and Cui 2011). The standard CSSO is also shown to be computationally inefficient as too many function calls are needed to converge (Yi et al 2008), but the use of approximation surrogates can bring a 1-2 order of magnitude reduction in the number of system analyses compared to AAO (Sellar and Batill 1996;Sellar et al 1996b;Simpson et al 2004). Thus the performances of MDO procedures are problem and implementation dependent, and the selection of a proper optimization procedure for a specific problem is more or less in an ad hoc manner.…”
Section: Introduction (mentioning, confidence: 99%)