2016
DOI: 10.1007/s10489-016-0825-8

Multi-objective ant lion optimizer: a multi-objective optimization algorithm for solving engineering problems

Cited by 576 publications (212 citation statements)
References 42 publications
“…where cumsum is the cumulative sum, t is the random walk step, n is the number of iterations, and r(t) is a stochastic function [35]. The random walk is normalized as follows:…”
Section: Ant Lion Optimizer (mentioning)
confidence: 99%
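The excerpt above refers to the cumulative-sum random walk used by the Ant Lion Optimizer. Below is a minimal Python sketch of that construction, assuming the commonly used binary stochastic function r(t) (1 when a uniform draw exceeds 0.5, otherwise 0) and a min-max normalization into an interval [c, d]; the function names and the bounds are illustrative, not taken from the cited paper.

```python
import numpy as np

def random_walk(n_iter, rng=None):
    """Cumulative-sum random walk as described in the excerpt:
    X(t) = [0, cumsum(2 r(t_1) - 1), ..., cumsum(2 r(t_n) - 1)],
    where r(t) is 1 if a uniform random number > 0.5, else 0 (assumed form)."""
    rng = np.random.default_rng() if rng is None else rng
    r = (rng.random(n_iter) > 0.5).astype(float)   # stochastic function r(t)
    steps = 2.0 * r - 1.0                          # map {0, 1} -> {-1, +1}
    return np.concatenate(([0.0], np.cumsum(steps)))

def normalize_walk(X, c, d):
    """Min-max normalize the walk into [c, d] so it stays inside the current
    search bounds (the normalization the excerpt refers to)."""
    a, b = X.min(), X.max()
    return (X - a) * (d - c) / (b - a) + c

walk = random_walk(n_iter=100)
scaled = normalize_walk(walk, c=-5.0, d=5.0)       # illustrative bounds
```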
“…The KH is used for solving single-objective optimization problems, while the ALO is used to solve the multi-objective one. The ALO is selected for solving the multi-objective problem due to its high convergence and coverage in obtaining Pareto optimal solutions [35].…”
Section: Ant Lion Optimizer (mentioning)
confidence: 99%
“…It has two important characteristics: a group search strategy and information exchange among individuals. Compared with traditional methods, a multi-objective evolutionary algorithm has several advantages: a single run of the algorithm yields a set of optimal solutions, avoiding the large time overhead that most traditional methods incur from repeated runs; the algorithm is not affected by the shape of the Pareto-optimal front; and because the evolutionary algorithm evaluates the relative merit of individuals through the objective function, solving a different problem only requires designing the corresponding fitness function, without modifying the other parts of the algorithm, which gives it good generality [9,10]. At present, many efficient multi-objective evolutionary algorithms have been widely used in engineering practice, production, and daily life.…”
Section: Fitness Evaluation and Selection Strategy In Multi-objective (mentioning)
confidence: 99%
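The excerpt's point that a single run yields a whole set of optimal solutions rests on Pareto dominance. The following minimal Python sketch, not taken from the cited works and using illustrative function names and bi-objective sample values, shows the dominance test and the non-dominated filtering that multi-objective algorithms use to accumulate such a set.

```python
import numpy as np

def dominates(f_a, f_b):
    """True if objective vector f_a Pareto-dominates f_b (minimization):
    no worse in every objective and strictly better in at least one."""
    f_a, f_b = np.asarray(f_a), np.asarray(f_b)
    return bool(np.all(f_a <= f_b) and np.any(f_a < f_b))

def non_dominated(front):
    """Return the non-dominated subset of a list of objective vectors --
    the 'set of optimal solutions' a single multi-objective run accumulates."""
    return [f for i, f in enumerate(front)
            if not any(dominates(g, f) for j, g in enumerate(front) if j != i)]

# Illustrative bi-objective values (both objectives minimized).
candidates = [(1.0, 4.0), (2.0, 3.0), (3.0, 3.5), (4.0, 1.0)]
print(non_dominated(candidates))   # [(1.0, 4.0), (2.0, 3.0), (4.0, 1.0)]
```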
“…It is worth mentioning that there are many methods to improve or optimize the weights of neural networks, including Genetic Algorithms (GA), Particle Swarm Optimization (PSO), etc. [37][38][39]. In this paper, we choose Bayesian regularization mainly because the data in our study are scarce, and the Bayesian method can improve the performance of neural networks (by reducing the training iterations) [36].…”
Section: Training Parameter Selection on the BP Neural Network (mentioning)
confidence: 99%