2019
DOI: 10.3390/math7111129
Long Term Memory Assistance for Evolutionary Algorithms

Abstract: Short term memory that records the current population has been an inherent component of Evolutionary Algorithms (EAs). As hardware technologies advance currently, inexpensive memory with massive capacities could become a performance boost to EAs. This paper introduces a Long Term Memory Assistance (LTMA) that records the entire search history of an evolutionary process. With LTMA, individuals already visited (i.e., duplicate solutions) do not need to be re-evaluated, and thus, resources originally designated t…
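The mechanism the abstract describes can be sketched as a fitness cache keyed by genotype: every evaluated individual is recorded, and a duplicate individual reuses the stored fitness instead of triggering a new evaluation. This is a hypothetical illustration of the idea, not the paper's implementation; the names (`LongTermMemory`, `evaluate`, the sphere fitness function) are invented for the sketch.

```python
def sphere(genotype):
    """Toy fitness function (sphere); stands in for an expensive evaluation."""
    return sum(x * x for x in genotype)

class LongTermMemory:
    """Illustrative LTMA-style cache: records every evaluated genotype
    so duplicate individuals are not re-evaluated."""

    def __init__(self, fitness_fn):
        self.fitness_fn = fitness_fn
        self.cache = {}           # genotype (as tuple) -> fitness
        self.true_evaluations = 0 # calls that actually ran fitness_fn
        self.duplicate_hits = 0   # calls answered from the search history

    def evaluate(self, genotype):
        key = tuple(genotype)     # hashable key into the search history
        if key in self.cache:
            self.duplicate_hits += 1
        else:
            self.cache[key] = self.fitness_fn(genotype)
            self.true_evaluations += 1
        return self.cache[key]

memory = LongTermMemory(sphere)
memory.evaluate([1.0, 2.0])   # first visit: fitness is computed and stored
memory.evaluate([1.0, 2.0])   # duplicate: fitness is reused, not recomputed
```

The saved evaluations are the "resources originally designated" for duplicates that the abstract refers to; in a real EA the cache would wrap whatever expensive objective function the population is evaluated against.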

Cited by 19 publications (6 citation statements)
References 63 publications
“…During the evolutionary process, duplicate rules are especially likely to be generated when the rule trees are still small. LTMA (Long Term Memory Assistance for Evolutionary Algorithms), introduced in [52], is an approach to discover re-visited individuals and reuse already calculated fitness values. It can significantly improve the runtime behavior and should be integrated in the next versions of Bat4CEP.…”
Section: Implementation Issues (mentioning, confidence: 99%)
“…The algorithm can be improved further using Long Term Memory Assistance (LTMA) [24], where duplicate solutions are identified. As such, time-consuming fitness evaluations are spared.…”
Section: Algorithm (mentioning, confidence: 99%)
“…However, to set the upper limit of the H(v) evaluations to be equal to the number of evaluations in the exhaustive search variant (Algorithm 5), we have used a simple memoization technique [27]. Memoization is often used in dynamic programming and recently also to improve the performance of metaheuristic algorithms [28]; it should be noted, however, that it leads to an increase in memory usage. Still, as pointed out in [28], it is hard to imagine this as an issue, given that advances in hardware technology lead to massive capacities of inexpensive memory.…”
Section: Input Dataset Y, Initialize Set (mentioning, confidence: 99%)
“…Memoization is often used in dynamic programming and recently also to improve the performance of metaheuristic algorithms [28]; it should be noted, however, that it leads to an increase in memory usage. Still, as pointed out in [28], it is hard to imagine this as an issue, given that advances in hardware technology lead to massive capacities of inexpensive memory. The memoized version of the coordinate-descent algorithm for optimum histogram binning v is given in Algorithm 7.…”
Section: Input Dataset Y, Initialize Set (mentioning, confidence: 99%)
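The memoization technique these statements describe can be sketched with a generic wrapper: repeated probes of an objective H(v) with the same argument are answered from a dictionary, trading memory for evaluations. This is a minimal illustration under assumed names; the toy objective below is a stand-in, not the cited histogram-binning criterion, and the wrapper is generic rather than the authors' Algorithm 7.

```python
def memoize(fn):
    """Generic memoization wrapper: cache results per argument."""
    cache = {}
    def wrapper(arg):
        if arg not in cache:
            cache[arg] = fn(arg)
        return cache[arg]
    wrapper.cache = cache  # exposed so memory cost can be inspected
    return wrapper

calls = 0  # counts how often the underlying objective actually runs

@memoize
def H(v):
    """Toy objective; stands in for an expensive criterion H(v)."""
    global calls
    calls += 1
    return (v - 3) ** 2

# Coordinate-descent-style search re-probes the same arguments;
# only the distinct ones (1, 2, 3) are actually computed.
for v in [1, 2, 3, 2, 1]:
    H(v)
```

The upper bound the quoted passage mentions follows directly: with the cache in place, the number of true H(v) evaluations can never exceed the number of distinct arguments, i.e., the exhaustive-search count, while the dictionary grows by one entry per distinct argument.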