Genetic local search for multi-objective combinatorial optimization (2002)
DOI: 10.1016/s0377-2217(01)00104-7

Cited by 442 publications (253 citation statements)
References 17 publications
“…The distance between two parents is measured in the decision space or the objective space. The necessity of mating restriction in EMO algorithms was also stressed by Jaszkiewicz [13] and Watanabe et al [18]. On the other hand, Zitzler & Thiele [20] reported that no improvement was achieved by mating restriction in their computational experiments.…”
Section: Introduction (mentioning)
confidence: 96%
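The mating restriction discussed in this citation statement pairs parents only if they are close to each other. As a minimal illustrative sketch (not taken from any of the cited papers), the following selects a first parent at random and restricts the second parent to solutions whose objective vectors lie within an assumed threshold `sigma_mate`, falling back to unrestricted choice when no neighbour qualifies:

```python
import random


def euclidean(a, b):
    """Euclidean distance between two objective vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5


def restricted_mating(population, objectives, sigma_mate, rng=random):
    """Pick a first parent at random, then a second parent whose
    objective vector is within sigma_mate of the first.

    `sigma_mate` is a hypothetical threshold parameter; the cited
    papers measure distance in either decision or objective space.
    """
    i = rng.randrange(len(population))
    neighbours = [
        j for j in range(len(population))
        if j != i and euclidean(objectives[i], objectives[j]) <= sigma_mate
    ]
    # Fall back to an unrestricted second parent if nobody is close enough.
    j = rng.choice(neighbours) if neighbours else rng.randrange(len(population))
    return population[i], population[j]
```

The same scheme works in decision space by computing the distance on genotypes (e.g. Hamming distance on bit strings) instead of on objective vectors.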
“…Recent EMO algorithms usually share some common ideas such as elitism, fitness sharing and Pareto ranking for improving both the diversity of solutions and the convergence speed to the Pareto-front (e.g., see Coello et al [1] and Deb [3]). In some studies, local search was combined with EMO algorithms for further improving the convergence speed to the Pareto-front [10,[12][13][14]. While mating restriction has been often discussed in the literature, its effect has not been clearly demonstrated.…”
Section: Introduction (mentioning)
confidence: 99%
“…The traditional GA usually consists of some operators such as crossover, selection and mutation which are simulated from biological and genetic processes. However, the traditional GA is inefficient for solving large optimization problems [2].…”
Section: Introduction (mentioning)
confidence: 99%
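The traditional GA operators named in this citation statement can be illustrated with a minimal bit-string sketch (a textbook formulation, not specific to any cited paper): one-point crossover exchanges suffixes of two parents, and mutation flips each bit independently with a small probability:

```python
import random


def one_point_crossover(p1, p2, rng=random):
    """Exchange the suffixes of two equal-length parent chromosomes
    at a randomly chosen cut point."""
    cut = rng.randrange(1, len(p1))
    return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]


def bit_flip_mutation(bits, p_mut, rng=random):
    """Flip each bit independently with probability p_mut."""
    return [1 - b if rng.random() < p_mut else b for b in bits]
```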
“…An early example is VEGA [58]; other examples include the algorithms proposed by Ishibuchi and Murata [38] and MOGLS of Jaszkiewicz [39]. Also ACO algorithms frequently use some form of scalarized aggregation, for example, for combining pheromone (or heuristic) information specific to each objective [5,28,47].…”
Section: Scalarization-based Multi-objective Optimization (mentioning)
confidence: 99%
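The scalarized aggregation described in this citation statement reduces a vector of objectives to a single value. Two standard scalarizing functions, sketched here for minimisation (the exact forms used by MOGLS and the ACO variants vary), are the weighted sum and the weighted Chebyshev distance to an ideal point:

```python
def weighted_sum(objective_vector, weights):
    """Weighted linear aggregation of the objectives."""
    return sum(w * f for w, f in zip(weights, objective_vector))


def weighted_chebyshev(objective_vector, weights, ideal):
    """Weighted Chebyshev scalarizing function: the largest weighted
    deviation from an (assumed known) ideal point `ideal`."""
    return max(w * (f - z) for w, f, z in zip(weights, objective_vector, ideal))
```

Varying the weight vector across runs, or across ants/individuals within a run, steers the search toward different regions of the Pareto front.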
“…The single-objective algorithm used to tackle the scalarized problems is an iterated local search algorithm, and the neighborhood operator is based on 2-Opt moves. This hybrid algorithm has been compared favorably to the best algorithm known at that time, the MOGLS algorithm from Jaszkiewicz [39]. A more in-depth experimental study of PD-TPLS and other TPLS variants has been presented by Paquete and Stützle [55].…”
Section: Hybrid TPLS+PLS (mentioning)
confidence: 99%
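The 2-Opt neighbourhood operator named in this citation statement removes two edges from a tour and reconnects it by reversing the segment in between. A minimal sketch on a permutation-encoded tour, with a helper to evaluate a single objective's tour length (illustrative only; the cited iterated local search wraps this in perturbation and acceptance steps):

```python
def two_opt_move(tour, i, k):
    """Apply one 2-Opt move: reverse the segment tour[i..k],
    which replaces edges (tour[i-1], tour[i]) and (tour[k], tour[k+1])."""
    return tour[:i] + tour[i:k + 1][::-1] + tour[k + 1:]


def tour_length(tour, dist):
    """Length of a closed tour under one distance matrix; a
    bi-objective TSP evaluates this once per objective."""
    n = len(tour)
    return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))
```

A local search then scans (i, k) pairs and accepts moves that improve the scalarized objective, restarting the scan until no improving move remains.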