2019
DOI: 10.1155/2019/2540102
Deep‐Mining Backtracking Search Optimization Algorithm Guided by Collective Wisdom

Abstract: The backtracking search optimization algorithm (BSA) is a recently proposed evolutionary algorithm with a simple structure and good global exploration capability, and it has been widely used to solve optimization problems. However, the exploitation capability of the BSA is poor. This paper proposes a deep-mining backtracking search optimization algorithm guided by collective wisdom (MBSAgC) to improve its performance. The proposed algorithm develops two learning mechanisms, i.e., a novel topological opposition-ba…

Cited by 5 publications (7 citation statements)
References 83 publications (119 reference statements)
“…Then, based on TOBL, the historical population is updated according to the distances of the opposition and current populations from the best-fitness population. Therefore, Equation (A3) in Appendix A.2 is adjusted as [37]
$$\hat{F}^{\text{old-pop}}_{i,j}=\begin{cases}F^{\text{opp-pop}}_{i,j}, & \text{if } \big\|F^{\text{opt-pop}}_{j}-F^{\text{opp-pop}}_{i,j}\big\| < \big\|F^{\text{opt-pop}}_{j}-F^{\text{pop}}_{i,j}\big\|,\\ F^{\text{pop}}_{i,j}, & \text{otherwise,}\end{cases}$$
where $F^{\text{opt-pop}}_{j}$ is the best population at the $j$th dimension.…”
Section: Improved Backtracking Search Optimization Algorithm
confidence: 99%
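A minimal sketch of this TOBL-style update, assuming the rule is applied element-wise over an (N, D) population matrix; the array names and the helper function are illustrative, not taken from the cited paper:

```python
import numpy as np

def tobl_update_old_pop(pop, opp_pop, opt_pop):
    """Hypothetical element-wise TOBL update of the historical population.

    pop     : (N, D) current population
    opp_pop : (N, D) topological-opposition population
    opt_pop : (D,)   best individual found so far (broadcast over rows)
    """
    # Keep the opposition value where it lies closer to the best individual
    # than the current value does; otherwise keep the current value.
    take_opp = np.abs(opt_pop - opp_pop) < np.abs(opt_pop - pop)
    return np.where(take_opp, opp_pop, pop)
```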
“…The initial lp value is 0.5 and is updated after each iteration. In this paper, lp is designed to control the execution frequency of the search operators in equations (22) and (23) versus those in equations (24) and (25). During evolution, when lp is less than a random number, operators 1 and 2 are used for mutation; otherwise, operators 3 and 4 are selected (see the sketch after this excerpt).…”
Section: Proposed Learning Parameter Mechanism
confidence: 99%
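A minimal sketch of the lp-based operator switch described above, assuming lp has already been set for the current iteration; the operator names are placeholders, and the lp update rule itself is not shown:

```python
import random

def select_operator_pair(lp: float):
    """Hypothetical selector: lp gates which pair of mutation operators runs."""
    # When lp is below a uniform random draw, mutate with operators 1 and 2;
    # otherwise mutate with operators 3 and 4, per the excerpt above.
    if lp < random.random():
        return ("operator_1", "operator_2")  # placeholder identifiers
    return ("operator_3", "operator_4")
```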
“…The description of these problems, including the three-bar truss, pressure vessel, speed reducer, gear train, cantilever beam, and I-beam, is given in Appendix A. Table 6 shows the parameter settings of the six engineering problems according to references [23, 24], where N, T, and D represent the population size, the maximum number of iterations, and the dimension of the optimization problems, respectively. However, each problem involves a number of complex constraints, which make it challenging to find an optimal solution satisfying all of them.…”
Section: Engineering Design Problems
confidence: 99%
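The excerpt does not say how these constraints are handled; one common approach in evolutionary algorithms, shown here purely as an illustration and not as the cited paper's method, is a static penalty added to the objective:

```python
def penalized_fitness(objective, constraints, x, rho=1e6):
    """Illustrative static-penalty fitness; not the cited paper's method.

    objective   : callable x -> float to minimize
    constraints : iterable of callables g, feasible when g(x) <= 0
    rho         : penalty weight (assumed value)
    """
    violation = sum(max(0.0, g(x)) for g in constraints)
    return objective(x) + rho * violation
```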
“…The GPEAe is a population-based stochastic search method that predicts the evolutionary direction at the macroscopic level to lead the population gradually toward the global optimum. The main difference between GPEAe and other evolutionary algorithms (EAs) [7]–[11] is that GPEAe adopts prediction theory to obtain the offspring. Unlike traditional optimization techniques, GPEAe does not rely on gradient information and has the ability to jump out of local optima.…”
Section: Introduction
confidence: 99%