1998
DOI: 10.1002/(sici)1099-1360(199801)7:1<34::aid-mcda161>3.0.co;2-6
Pareto simulated annealing—a metaheuristic technique for multiple‐objective combinatorial optimization

Abstract: This paper presents a multiple-objective metaheuristic procedure, Pareto simulated annealing. The goal of the procedure is to find, in a relatively short time, a good approximation of the set of efficient solutions of a multiple-objective combinatorial optimization problem. The procedure uses a sample of so-called generating solutions. Each solution explores its neighbourhood in a way similar to that of classical simulated annealing. Weights of the objectives, used for their local aggregation, are tuned in each i…
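The abstract above describes the core loop: a sample of generating solutions, each performing simulated-annealing moves guided by a weighted aggregation of the objectives, while non-dominated points are collected. The sketch below is an illustration of that idea, not the paper's algorithm: the `init_solutions`, `neighbor`, and `evaluate` callables are hypothetical placeholders, and the paper's weight-tuning (repulsion) rule is simplified to a random re-draw of weights each step.

```python
import math
import random

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_sa(init_solutions, neighbor, evaluate, temp=1.0, cooling=0.95, iters=200):
    """Minimal Pareto-simulated-annealing-style sketch (minimization).

    Assumptions: `neighbor(s)` returns a random neighbour of solution s,
    `evaluate(s)` returns its objective vector. Weight tuning is
    simplified to a fresh random draw per step.
    """
    sample = list(init_solutions)
    archive = []  # running approximation of the efficient set
    for _ in range(iters):
        for i, s in enumerate(sample):
            k = len(evaluate(s))
            w = [random.random() for _ in range(k)]
            total = sum(w)
            w = [x / total for x in w]  # normalized weights for local aggregation
            cand = neighbor(s)
            fs, fc = evaluate(s), evaluate(cand)
            # change in the locally aggregated (weighted-sum) objective
            delta = sum(wi * (ci - si) for wi, si, ci in zip(w, fs, fc))
            if delta <= 0 or random.random() < math.exp(-delta / temp):
                sample[i] = cand
            # keep the archive mutually non-dominated
            f = evaluate(sample[i])
            if not any(dominates(evaluate(a), f) for a in archive):
                archive = [a for a in archive if not dominates(f, evaluate(a))]
                archive.append(sample[i])
        temp *= cooling
    return archive
```

A toy usage: solutions are integers in [0, 10] with conflicting objectives (x, 10 - x), so every point is efficient and the archive simply accumulates non-dominated values.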


Cited by 665 publications (276 citation statements). References 14 publications.
“…As one representative of the SAC search model, we examine the straightforward approach that solves several scalarizations of the objective function vector and tackles each of these problems with an underlying algorithm for the corresponding single-objective version, an approach which underlies many earlier proposed algorithms [20,21,26-29]. We follow the well-known principle of defining scalarizations of the objective function vector with respect to a weighted sum.…”
Section: The Search Model and Algorithmic Components
confidence: 99%
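The weighted-sum scalarization mentioned in the statement above can be shown in a few lines. This is a generic illustration, not code from any cited paper; `candidates` and `evaluate` are hypothetical stand-ins for a single-objective solver and problem.

```python
def weighted_sum(weights, objectives):
    """Scalarize an objective vector by a weighted sum (minimization)."""
    return sum(w * f for w, f in zip(weights, objectives))

def scalarized_search(candidates, evaluate, weight_vectors):
    """One single-objective search per weight vector: for each weight
    vector, return the candidate minimizing the weighted sum. Exhaustive
    search stands in for the underlying single-objective algorithm."""
    return [min(candidates, key=lambda s: weighted_sum(w, evaluate(s)))
            for w in weight_vectors]
```

Different weight vectors steer the search toward different efficient solutions; for example, the extreme weights (1, 0) and (0, 1) recover the single-objective optima.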
“…This performance measure was used in Czyzak & Jaszkiewicz [2] and referred to as R D1 in Knowles & Corne [15]. The R D1 measure needs all Pareto-optimal solutions of each test problem.…”
Section: Performance Measures
confidence: 99%
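The R_D1 / D1 measure referred to above averages, over a reference set (e.g. all Pareto-optimal points), the distance from each reference point to its nearest point in the approximation. The sketch below is a simplified reading of that idea: plain Chebyshev distance is used in place of the weighted achievement-style distance of the original definition.

```python
def d1_measure(approx, reference):
    """Simplified D1-style coverage measure (lower is better): mean,
    over reference points, of the minimum Chebyshev distance to the
    approximation set."""
    def cheb(a, b):
        return max(abs(x - y) for x, y in zip(a, b))
    return sum(min(cheb(r, a) for a in approx) for r in reference) / len(reference)
```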
“…In order to guarantee that a dispersed set of efficient solutions is obtained, several techniques have been proposed; these can be classified into one of two main groups: methods that aggregate the objective functions and dynamically modify the search direction during the search process or within several runs [3,6,7,9,10,19,23], and methods that are based on a Pareto-like dominance relation as an acceptance criterion in the search to distinguish between candidate solutions [24,13]; this second type of method avoids the aggregation of objectives. Recently, empirical evidence was gathered suggesting that methods belonging to the first group perform particularly well [10], the main reason being that local search algorithms can deal more easily with aggregated objective functions.…”
Section: Introduction
confidence: 99%