Proceedings of the Companion Publication of the 2015 Annual Conference on Genetic and Evolutionary Computation
DOI: 10.1145/2739482.2764691

Denoising Autoencoders for Fast Combinatorial Black Box Optimization

Abstract: We integrate a Denoising Autoencoder (DAE) into an Estimation of Distribution Algorithm (EDA) and evaluate the performance of DAE-EDA on several combinatorial optimization problems. We assess the number of fitness evaluations and the required CPU times. Compared to the state-of-the-art Bayesian Optimization Algorithm (BOA), DAE-EDA needs more fitness evaluations, but is considerably faster, sometimes by orders of magnitude. These results show that DAEs can be useful tools for problems with low but non-negligible…
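As a rough illustration of the scheme the abstract describes, a minimal DAE-EDA loop might look like the following sketch in Python/NumPy. The population size, corruption rate, and the `train_dae` helper (assumed to return a reconstruction function mapping bit matrices to per-bit probabilities) are illustrative assumptions, not the paper's actual settings.

```python
import numpy as np

def dae_eda(fitness, n_bits, pop_size=200, n_gens=100, rng=None):
    """Minimal DAE-EDA loop: select good bit strings, fit a denoising
    autoencoder to them, and sample the next population from the model.
    `train_dae` is a hypothetical helper returning a reconstruction
    function (bit matrix -> per-bit probabilities)."""
    rng = rng or np.random.default_rng(0)
    pop = rng.integers(0, 2, size=(pop_size, n_bits))
    best = max(pop, key=fitness)
    for _ in range(n_gens):
        # Truncation selection: keep the better half of the population.
        ranked = sorted(pop, key=fitness, reverse=True)
        selected = np.array(ranked[: pop_size // 2])
        reconstruct = train_dae(selected)           # hypothetical model fit
        # Sampling: corrupt random parents, let the DAE denoise them, then
        # threshold the reconstruction probabilities into new bit strings.
        parents = selected[rng.integers(0, len(selected), size=pop_size)]
        flip = rng.random(parents.shape) < 0.1
        probs = reconstruct(np.where(flip, 1 - parents, parents))
        pop = (rng.random(probs.shape) < probs).astype(int)
        cand = max(pop, key=fitness)
        if fitness(cand) > fitness(best):
            best = cand
    return best
```

For instance, `dae_eda(lambda x: int(x.sum()), n_bits=50)` would run this sketch on a simple OneMax-style objective.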

Cited by 12 publications (6 citation statements) · References 13 publications
“…However, other neural-network-based EDAs such as DAE-EDA are computationally less expensive. Hence, they need considerably less time for solving the same benchmark instances to optimality ([18,19,20]).…”
Section: Methods
confidence: 99%
“…On top of that, the NN can be applied in multiple ways. For instance, autoencoders [7,167], a particular class of NNs, have been used to sample new solutions in a way that resembles sampling in EDAs [121], but also to adaptively learn a "non-linear mutation operator" that attempts to widen the basins of attraction around the best individuals. In terms of the EDA performance, the potential of the neural model to capture intricate relationships among the variables is as relevant as the specific way it is utilized within the evolutionary algorithm.…”
Section: Neural and Deep Learning Models in Estimation of Distribution Algorithms
confidence: 99%
confidence: 99%
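To make the two uses mentioned in this excerpt concrete, the sketch below shows how a trained autoencoder could act as a "non-linear mutation operator": corrupt a parent, then let the model pull the offspring back toward the region of good solutions it was trained on. This is an illustrative reading of the idea, not the implementation of the cited works; `reconstruct` is an assumed interface (bit vector in, per-bit probabilities out).

```python
import numpy as np

def autoencoder_mutation(parent, reconstruct, noise_rate=0.1, rng=None):
    """Illustrative 'non-linear mutation': flip a few bits of the parent and
    map the result through a trained (denoising) autoencoder, so offspring
    are drawn back toward the region of good solutions the model has learned.
    `reconstruct` (bits -> per-bit probabilities) is an assumed interface,
    not a published API of the cited works."""
    rng = rng or np.random.default_rng()
    flip = rng.random(parent.shape) < noise_rate       # random bit flips
    noisy = np.where(flip, 1 - parent, parent)
    probs = reconstruct(noisy)                         # denoised probabilities
    return (rng.random(probs.shape) < probs).astype(int)
```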
“…The neural models that have been tried for EDAs exhibit a variety of behaviors: autoencoders used in a traditional EDA scheme in [121] are extremely fast when compared with methods that learn Bayesian networks, but they fail to achieve the same efficiency as BOA in terms of function evaluations. When used as a mutation distribution in [26], GA-dA (see Table 9) outperforms BOA on some problems (notably on the knapsack problem) but is outperformed on the hierarchical HIFF function.…”
Section: Neural and Deep Learning Models in Estimation of Distribution Algorithms
confidence: 99%
“…DAE-EDA uses a Denoising Autoencoder (DAE) as its probabilistic model [19,11,12]. The DAE is a feed-forward neural network, which is trained using the backpropagation algorithm.…”
Section: Algorithms for Comparison
confidence: 99%
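The excerpt's description of the model, a feed-forward network trained with backpropagation on a denoising objective, can be sketched for binary strings roughly as follows. The single hidden layer, corruption rate, learning rate, and epoch count are illustrative guesses, not the paper's configuration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_dae(X, n_hidden=64, corruption=0.1, lr=0.1, epochs=50, rng=None):
    """Single-hidden-layer denoising autoencoder for binary vectors, trained
    with plain backpropagation (gradient descent on cross-entropy). All
    hyperparameters are illustrative, not the paper's settings. Returns a
    reconstruction function (bits -> per-bit probabilities)."""
    rng = rng or np.random.default_rng(0)
    n, d = X.shape
    W1 = rng.normal(0, 0.1, size=(d, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(0, 0.1, size=(n_hidden, d)); b2 = np.zeros(d)
    for _ in range(epochs):
        # Denoising objective: reconstruct the clean X from corrupted copies.
        flip = rng.random(X.shape) < corruption
        X_noisy = np.where(flip, 1 - X, X)
        H = sigmoid(X_noisy @ W1 + b1)          # encoder (forward pass)
        Y = sigmoid(H @ W2 + b2)                # decoder (forward pass)
        # Backpropagate the cross-entropy gradient through both layers.
        dY = (Y - X) / n
        dW2, db2 = H.T @ dY, dY.sum(axis=0)
        dH = (dY @ W2.T) * H * (1 - H)
        dW1, db1 = X_noisy.T @ dH, dH.sum(axis=0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    return lambda V: sigmoid(sigmoid(V @ W1 + b1) @ W2 + b2)
```

The returned function yields probabilities rather than bits, which matches the thresholding step used when sampling new candidate solutions in the loop sketched above.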