2007
DOI: 10.1002/fld.1584

Direct, adjoint and mixed approaches for the computation of Hessian in airfoil design problems

Abstract: In this paper, four approaches to compute the Hessian matrix of an objective function often used in aerodynamic inverse design problems are presented. The computationally least expensive among them is selected and applied to the reconstruction of cascade airfoils that reproduce a prescribed pressure distribution over their walls, under inviscid and viscous flow considerations. The selected approach is based on the direct sensitivity analysis method for the computation of first derivatives, followed by th…

Cited by 42 publications (43 citation statements). References 25 publications.
“…The idea of gradient smoothing using Sobolev gradients [23] has been extended to more complex approximate Hessians via shape calculus and Fourier analysis [2,48]. With the advent of automatic differentiation (AD), exact second-order sensitivities [51,38,14] have been employed in ASO problems. However, the high cost of computing the exact Hessian has led to truncated-Newton methods [35,19] that use conjugate-gradient or Newton-Krylov methods to approximate the search direction with matrix-vector products of the Hessian.…”
Section: The Hessian
confidence: 99%
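The truncated-Newton idea above — approximating the Newton step with conjugate gradients driven only by Hessian-vector products, never the full Hessian — can be sketched in a few lines of NumPy. This is a minimal illustration on a quadratic test objective, not the paper's method: `hessian_vector_product` and `truncated_newton_step` are hypothetical names, and the product is formed here by finite-differencing the gradient, where an adjoint-based solver would obtain it from an extra adjoint solve.

```python
import numpy as np

def hessian_vector_product(grad, x, v, eps=1e-6):
    # Central-difference approximation of H @ v from two gradient evaluations;
    # in an adjoint framework this product would come from a second adjoint solve.
    return (grad(x + eps * v) - grad(x - eps * v)) / (2.0 * eps)

def truncated_newton_step(grad, x, max_cg_iters=20, tol=1e-8):
    """Approximately solve H p = -g with conjugate gradients,
    using only matrix-free Hessian-vector products."""
    g = grad(x)
    p = np.zeros_like(x)
    r = -g.copy()            # residual of H p = -g at p = 0
    d = r.copy()
    rs_old = r @ r
    for _ in range(max_cg_iters):
        Hd = hessian_vector_product(grad, x, d)
        alpha = rs_old / (d @ Hd)
        p += alpha * d
        r -= alpha * Hd
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        d = r + (rs_new / rs_old) * d
        rs_old = rs_new
    return p

# Quadratic test: f(x) = 0.5 x^T A x - b^T x, so one Newton step reaches A^{-1} b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b
x0 = np.array([5.0, -3.0])
x1 = x0 + truncated_newton_step(grad, x0)
```

For a quadratic objective the CG inner loop is exact, so a single outer step lands on the minimizer; on a nonlinear flow objective the inner iteration is truncated early, which is the point of the method.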
“…It has since been extended to other areas such as optimization, extrapolation, and uncertainty analysis. The work of Papadimitriou and Giannakoglou [38,37,36,15,39] is more closely tied to ASO and explores the use of an exactly-initialized BFGS algorithm to optimize two-dimensional aerodynamic shapes. Although the Hessian is evaluated only once in the exactly-initialized BFGS algorithm, that initial cost is still too large to be effective.…”
Section: The Hessian
confidence: 99%
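The exactly-initialized BFGS scheme mentioned above can be sketched as follows: pay for one exact Hessian at the starting point, invert it to seed the quasi-Newton approximation, then apply the standard BFGS inverse-Hessian update from gradients alone. This is a simplified illustration under stated assumptions (`bfgs_exact_init` is a hypothetical name, a unit step replaces the usual line search), not the authors' implementation.

```python
import numpy as np

def bfgs_exact_init(f_grad, exact_hessian, x0, iters=10):
    """BFGS whose inverse-Hessian approximation is seeded with the exact
    Hessian at x0 (computed once), then updated cheaply from gradients."""
    x = x0.copy()
    g = f_grad(x)
    Hinv = np.linalg.inv(exact_hessian(x0))   # the single exact Hessian evaluation
    for _ in range(iters):
        p = -Hinv @ g
        x_new = x + p                         # unit step; a line search is usual
        g_new = f_grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if abs(sy) < 1e-12:                   # skip update if curvature is lost
            break
        rho = 1.0 / sy
        I = np.eye(len(x))
        # Standard BFGS inverse-Hessian update.
        Hinv = (I - rho * np.outer(s, y)) @ Hinv @ (I - rho * np.outer(y, s)) \
               + rho * np.outer(s, s)
        x, g = x_new, g_new
        if np.linalg.norm(g) < 1e-10:
            break
    return x

# Quadratic test: the exact seed makes the very first step a full Newton step.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_opt = bfgs_exact_init(lambda x: A @ x - b, lambda x: A, np.array([5.0, -3.0]))
```

The trade-off the citation statement criticizes is visible in the structure: everything after the `np.linalg.inv` line is cheap, but that one line stands in for a full second-order sensitivity analysis, whose cost dominates for high-dimensional design spaces.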
“…It can be calculated using finite differences, the direct method, or the adjoint method [10,11]. These techniques will be described in detail later in this work.…”
Section: Introduction
confidence: 99%
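Of the three gradient techniques named above, finite differences is the simplest to sketch; the snippet below (an illustrative stand-in, with a toy algebraic function in place of a flow solver) also shows why it scales poorly: each design variable costs two objective evaluations, which is what motivates the direct and adjoint alternatives.

```python
import numpy as np

def fd_gradient(f, x, eps=1e-6):
    """Central finite-difference gradient: 2n objective (i.e. flow-solution)
    evaluations for n design variables."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (f(x + e) - f(x - e)) / (2.0 * eps)
    return g

# Toy objective standing in for a flow-based functional.
f = lambda x: x[0]**2 + 3.0 * x[0] * x[1] + 2.0 * x[1]**2
x = np.array([1.0, 2.0])
g = fd_gradient(f, x)   # analytic gradient: [2*x0 + 3*x1, 3*x0 + 4*x1]
```

The direct method instead differentiates the governing equations once per design variable, while the adjoint method obtains the whole gradient at the cost of a single extra adjoint solve, independent of the number of design variables.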