2021
DOI: 10.1109/access.2021.3073480

Improved Initialization Method for Metaheuristic Algorithms: A Novel Search Space View

Abstract: As an essential step of metaheuristic optimizers, initialization strongly affects convergence speed and solution accuracy. The main motivation of state-of-the-art initialization methods is to generate a small initial population that covers the search space as uniformly as possible. However, these approaches suffer from the curse of dimensionality, high computational cost, and sensitivity to parameters, which ultimately reduce the algorithm's convergence speed. In this paper, a new initializati…

Cited by 18 publications (5 citation statements)
References 70 publications
“…The techniques like diagonal linear uniform initialization can be used to reduce the computational cost (Li et al, 2021).…”
Section: High Computational Complexity (mentioning; confidence: 99%)
“…The process of population initialization also plays an important role in computational complexity, as selecting a large population may increase the complexity of the soft computing technique in predicting software maintainability. The techniques like diagonal linear uniform initialization can be used to reduce the computational cost (Li et al, 2021).…”
Section: Challenges and Potential Solutions (mentioning; confidence: 99%)
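The citing papers above point to diagonal linear uniform initialization as a low-cost alternative to space-filling designs. A minimal sketch of one plausible reading of that idea, assuming it places the initial individuals at evenly spaced points along the main diagonal of the search space (the function name and exact spacing rule are illustrative, not taken from Li et al., 2021):

```python
import numpy as np

def diagonal_init(n_individuals, lower, upper):
    """Place individuals at evenly spaced points along the main diagonal
    of the box [lower, upper] -- a hypothetical sketch of diagonal linear
    uniform initialization. Cost is O(N * D), independent of any
    space-filling criterion, which is why it scales to high dimensions."""
    lower = np.asarray(lower, dtype=float)
    upper = np.asarray(upper, dtype=float)
    # The same fraction is applied in every dimension, so individual i
    # sits at lower + (i / (N - 1)) * (upper - lower) on the diagonal.
    fractions = np.linspace(0.0, 1.0, n_individuals)[:, None]
    return lower + fractions * (upper - lower)

pop = diagonal_init(5, lower=[-10, -10, -10], upper=[10, 10, 10])
# Each row is one individual; rows run from `lower` to `upper`.
```

Because every individual lies on a single line, such a scheme trades coverage of the full volume for a deterministic, parameter-free start whose cost grows only linearly in population size and dimension.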
“…In practice, however, uniform distributions may not be suitable for all applications. There are also some other commonly used initialization techniques, such as chaotic initialization, sequence-based deterministic initialization, opposition-based learning, and Latin hypercube sampling [128].…”
Section: Task 1 Initialization (mentioning; confidence: 99%)
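Of the alternatives the statement above lists, Latin hypercube sampling is the most common stratified design. A self-contained sketch, assuming a box-constrained search space (hand-rolled here with NumPy for illustration; `scipy.stats.qmc.LatinHypercube` offers a library implementation):

```python
import numpy as np

def latin_hypercube_init(n_individuals, lower, upper, seed=None):
    """Latin hypercube initialization: each dimension is split into
    n_individuals equal strata, and every stratum is sampled exactly
    once, so the marginal coverage of each dimension is guaranteed to
    be even -- unlike plain uniform random sampling."""
    rng = np.random.default_rng(seed)
    lower = np.asarray(lower, dtype=float)
    upper = np.asarray(upper, dtype=float)
    dim = lower.size
    # One random point inside each of the N strata, per dimension.
    u = (np.arange(n_individuals)[:, None]
         + rng.random((n_individuals, dim))) / n_individuals
    # Decouple the dimensions by permuting the strata independently
    # in every column.
    for d in range(dim):
        u[:, d] = u[rng.permutation(n_individuals), d]
    return lower + u * (upper - lower)

pop = latin_hypercube_init(8, lower=[0.0, 0.0], upper=[1.0, 1.0], seed=0)
```

Per dimension, exactly one individual falls in each of the 8 strata, which is the property that distinguishes this design from independent uniform draws.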
“…For a recent approach to restrict the sampling to a subspace, see, for example, Li et al. (2021a). Watson (2010) asserted that the computational cost of sampling should be significantly lower than the cost of solving the problem with randomly generated individuals.…”
Section: Single-solution Metaheuristics (mentioning; confidence: 99%)
“…Additionally, we refer to Watson (2010) who pointed out that sampling the search space should attempt to provide as wide a coverage of the search space as possible within the limits of an acceptable computational cost. For a recent approach to restrict the sampling to a subspace, see, for example, Li et al (2021a). Watson (2010) asserted that the computational cost of sampling should be significantly lower than the cost of solving the problem with randomly generated individuals.…”
Section: Measuring Modification Costs (mentioning; confidence: 99%)