Handbook of Heuristics 2018
DOI: 10.1007/978-3-319-07124-4_29

Memetic Algorithms

Cited by 17 publications (12 citation statements)
References 230 publications
“…Memetic algorithms are flexible metaheuristic approaches (Cotta et al [2018]). This means that they are higher-level procedures, and can be applied to solve a variety of problems.…”
Section: State-of-the-art of Memetic Algorithms (mentioning)
confidence: 99%
“…MAs, first proposed by Moscato [14,22], represent a recent growing area of research in evolutionary computation. The MA meta-heuristics essentially combine an EA with local search techniques that can be seen as a learning procedure that makes individuals capable of performing local refinements.…”
Section: Memetic Algorithms (mentioning)
confidence: 99%
“…Population-based evolutionary approaches, such as genetic algorithms (GAs) or particle swarm optimization (PSO) [12], are able to detect in a fast way the main regions of attraction, while their local search abilities represent a major drawback [13] from the point of view of solution accuracy and of the convergence behaviour, especially when applied to multimodal problems. MAs [1,14] have recently received attention as effective meta-heuristics to improve generic EA schemes by combining these latter with local search (LS) procedures [13,15]. In MAs, EA operations are employed for global rough exploration, and LS operators are used to execute further exploitation of single EA individuals.…”
Section: Introduction (mentioning)
confidence: 99%
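
The excerpts above describe the same basic scheme: an evolutionary algorithm handles rough global exploration, while a local search step refines individual solutions. The following Python sketch shows one possible instantiation of that scheme; the Rastrigin objective, the operators (tournament selection, blend crossover, Gaussian mutation, hill-climbing local search), and all parameter values are illustrative assumptions, not details taken from the chapter or the citing papers.

```python
# Minimal memetic algorithm sketch: an evolutionary loop in which every
# offspring is refined by a short hill-climbing local search (assumed setup).
import random
import math

DIM, POP, GENS = 5, 30, 100

def rastrigin(x):
    # Multimodal benchmark: global optimum f(0, ..., 0) = 0.
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def random_individual():
    return [random.uniform(-5.12, 5.12) for _ in range(DIM)]

def tournament(pop):
    # Pick the better of two random individuals (rough global selection pressure).
    a, b = random.sample(pop, 2)
    return min(a, b, key=rastrigin)

def crossover(p1, p2):
    # Arithmetic blend of the two parents.
    w = random.random()
    return [w * a + (1 - w) * b for a, b in zip(p1, p2)]

def mutate(x, sigma=0.3):
    return [xi + random.gauss(0, sigma) for xi in x]

def local_search(x, steps=20, step=0.05):
    # Exploitation phase: first-improvement hill climbing around x.
    best, best_f = x[:], rastrigin(x)
    for _ in range(steps):
        cand = [xi + random.uniform(-step, step) for xi in best]
        f = rastrigin(cand)
        if f < best_f:
            best, best_f = cand, f
    return best

def memetic_algorithm():
    pop = [random_individual() for _ in range(POP)]
    for _ in range(GENS):
        offspring = []
        for _ in range(POP):
            child = mutate(crossover(tournament(pop), tournament(pop)))
            offspring.append(local_search(child))  # the "memetic" refinement step
        # Elitist replacement: keep the best POP individuals overall.
        pop = sorted(pop + offspring, key=rastrigin)[:POP]
    return pop[0]

if __name__ == "__main__":
    best = memetic_algorithm()
    print("best solution:", best, "fitness:", rastrigin(best))
```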
“…For instance, conventional evolutionary algorithms (EAs) can only estimate the optimum search space area within a cost‐effective time and present great problems in fine‐tuning solutions [2832]. These disadvantages can be overwhelmed by applying exploitative search to optimise the final population of solutions estimated by the EAs [33, 34]. Both effectiveness and solutions' quality are enhanced using this two‐step approach.…”
Section: Introduction (mentioning)
confidence: 99%
“…Both effectiveness and solutions' quality are enhanced using this two‐step approach. This big family of optimisation methods, which have elements from metaheuristic and EAs, and also include local learning or improvement procedures are generally known as MAs [34]. These have a number of advantages, such as simple implementation and capability to deal with different functional problem representations [33, 34].…”
Section: Introduction (mentioning)
confidence: 99%
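
For contrast with the in-loop refinement shown earlier, this second sketch follows the two-step structure described in the excerpt: a plain evolutionary loop runs to completion, and an exploitative local search pass is then applied only to the final population. The objective function, operators, and parameters are again illustrative assumptions.

```python
# Two-step sketch (assumed setup): Step 1 runs a simple EA; Step 2 refines
# each member of the final population with exploitative hill climbing.
import random

def sphere(x):
    # Simple objective used only for illustration.
    return sum(xi * xi for xi in x)

def evolve(pop_size=20, dim=4, gens=50):
    # Step 1: rough global exploration with mutation-only evolution.
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        children = [[xi + random.gauss(0, 0.5) for xi in ind] for ind in pop]
        pop = sorted(pop + children, key=sphere)[:pop_size]
    return pop

def refine(ind, steps=200, step=0.02):
    # Step 2: exploitative hill climbing applied after the EA has finished.
    best, best_f = ind[:], sphere(ind)
    for _ in range(steps):
        cand = [xi + random.uniform(-step, step) for xi in best]
        if sphere(cand) < best_f:
            best, best_f = cand, sphere(cand)
    return best

final_pop = [refine(ind) for ind in evolve()]
print("best refined fitness:", min(sphere(ind) for ind in final_pop))
```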