2015
DOI: 10.1115/1.4028755
Multi-Objective Robust Optimization Using a Postoptimality Sensitivity Analysis Technique: Application to a Wind Turbine Design

Abstract: In a multi-objective robust optimization problem, the variations in the design variables and design environment parameters comprise small variations, which have a small effect on the performance functions and/or the constraints, and large variations, which have a large effect on the performance functions and/or the constraints. The robustness of the performance functions is discussed in this paper. A post-optimality sensitivity analysis technique for multi-objective robust op…

Cited by 12 publications (9 citation statements)
References 74 publications
“…4. The Pareto-optimal solutions lie on a boundary in the Performance Function Space (PF-Space) between the two objective functions, called the Pareto front [17], shown in Fig. 5.…”
Section: Results Analysis
confidence: 99%
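The quoted passage describes the Pareto front as the set of non-dominated points in the performance-function space. As a concrete illustration (not from the paper), a minimal Pareto-dominance filter in Python, assuming minimization of every objective; the function name and inputs are hypothetical:

```python
# Illustrative sketch: extract the Pareto front from a set of
# performance-function values, assuming all objectives are minimized.

def pareto_front(points):
    """Return the non-dominated points: a point p is dominated when some
    other point q is <= p in every objective and differs in at least one."""
    front = []
    for p in points:
        dominated = any(
            all(q[i] <= p[i] for i in range(len(p))) and q != p
            for q in points
        )
        if not dominated:
            front.append(p)
    return front
```

For example, among the points (1, 3), (2, 2), (3, 1), and (2, 3), only (2, 3) is dominated (by (2, 2)); the other three form the front.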
“…By substituting Eqns. (17) and (18) separately in Eqn. (13), a single SE constraint at the contact point C r j will form two hyperplanes in the wrench space expressed as:…”
Section: Available Wrench Set
confidence: 99%
“…A Pareto Front (PF) is achieved through this compromise. Currently, multi-objective optimization of wind turbines is achieved entirely by evolutionary algorithms, including the hierarchical genetic algorithm [5], the Pareto archived evolution strategy [6], the strength Pareto evolutionary algorithm 2 [7,8], the multi-objective genetic algorithm [9], the non-dominated sorting genetic algorithm-II (NSGA-II) [10][11][12][13], and particle swarm optimization (PSO) [14,15]. These algorithms are categorized as gradient-free algorithms (GFAs) [16].…”
Section: Introduction
confidence: 99%
“…GFA diversity is maintained by clustering operators [5][6][7] and crowding distance [9][10][11][12][13], as well as their variants [14,22], which assign a virtual value to each individual that approximates the density of the adjacent solutions. As the distances from a specific solution to its adjacent solutions increase, this value increases, raising the probability that the solution is selected.…”
Section: Introduction
confidence: 99%
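The density estimator described in the quote above matches the crowding-distance operator of NSGA-II: each solution's value sums, per objective, the normalized gap between its two nearest neighbors, with boundary solutions set to infinity so they are always retained. A minimal sketch in Python, assuming a list of objective-value tuples; the names are illustrative, not from the paper:

```python
# Hedged sketch of the NSGA-II crowding-distance density estimator:
# a larger value means sparser neighborhood and higher selection priority.

def crowding_distance(points):
    """For each solution, sum over objectives the normalized distance
    between its two sorted neighbors; boundary solutions get infinity."""
    n = len(points)
    if n == 0:
        return []
    m = len(points[0])          # number of objectives
    dist = [0.0] * n
    for obj in range(m):
        order = sorted(range(n), key=lambda i: points[i][obj])
        lo, hi = points[order[0]][obj], points[order[-1]][obj]
        dist[order[0]] = dist[order[-1]] = float("inf")
        if hi == lo:            # degenerate objective: all values equal
            continue
        for k in range(1, n - 1):
            gap = points[order[k + 1]][obj] - points[order[k - 1]][obj]
            dist[order[k]] += gap / (hi - lo)
    return dist
```

Interior solutions flanked by distant neighbors receive larger values, which is exactly the "increasing probability of selection" behavior the quoted passage describes.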