2023
DOI: 10.1109/access.2023.3258970

An Improved Future Search Algorithm Based on the Sine Cosine Algorithm for Function Optimization Problems

Abstract: The future search algorithm imitates how people live their lives: if a person finds that his life is not good, he tries to change it by imitating a more successful person. To overcome the insufficient performance of the basic future search algorithm, this paper proposes an improved future search algorithm based on the sine cosine algorithm (FSASCA). The proposed algorithm uses the sine cosine algorithm to iteratively search for the best solution. The searching method of the sine cosine algorithm can m…
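For context, the sine cosine component that FSASCA builds on updates each candidate solution by moving it toward the best-known solution with sine- or cosine-weighted steps. The snippet below is a minimal sketch of the standard SCA position update, not the authors' exact FSASCA hybrid; the function name, parameter defaults, and array layout are assumptions for illustration.

```python
import numpy as np

def sca_step(population, best, t, T, a=2.0, rng=np.random.default_rng()):
    """One iteration of the standard sine cosine algorithm (SCA) update.

    population : (n, dim) array of candidate solutions
    best       : (dim,) best solution found so far (the destination point)
    t, T       : current iteration index and total number of iterations
    a          : constant controlling how fast the step scale r1 decays
    """
    n, dim = population.shape
    r1 = a - t * (a / T)                    # shrinks linearly: exploration -> exploitation
    new_pop = population.copy()
    for i in range(n):
        for j in range(dim):
            r2 = rng.uniform(0, 2 * np.pi)  # direction toward/away from the destination
            r3 = rng.uniform(0, 2)          # random weight on the destination
            r4 = rng.uniform(0, 1)          # switch between the sine and cosine moves
            if r4 < 0.5:
                new_pop[i, j] += r1 * np.sin(r2) * abs(r3 * best[j] - population[i, j])
            else:
                new_pop[i, j] += r1 * np.cos(r2) * abs(r3 * best[j] - population[i, j])
    return new_pop
```

Because r1 decays with the iteration counter, early iterations take large, exploratory steps and later iterations concentrate the search around the current best solution.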

Cited by 4 publications (2 citation statements)
References: 50 publications
“…In this work, the FSA optimizes the hyperparameter values in the ResNeXt approach by simulating optimal lifestyles for individuals, mapping this concept to the selection process of ResNeXt's hyperparameters. Employing a mathematical formula, the FSA enhances the initial random parameters through a global search among successful individuals and a local search within the population [22]. The FSA can be generated dependent upon mathematical formulas and begins phases depending on random solutions:…”
Section: Hyperparameter Tuning (mentioning)
Confidence: 99%
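The statement above describes the FSA loop as random initial solutions refined by a local search within the population and a global search among successful individuals. Below is a minimal sketch of that loop under those assumptions; the update and re-initialization formulas follow the commonly cited FSA equations, and the function name, objective, and bounds are placeholders rather than the cited paper's exact implementation.

```python
import numpy as np

def fsa_optimize(fitness, bounds, n=30, iters=100, rng=np.random.default_rng()):
    """Sketch of the future search algorithm (FSA) loop described above.

    fitness : callable mapping a (dim,) vector to a scalar (lower is better)
    bounds  : (dim, 2) array of [low, high] limits per dimension
    """
    low, high = bounds[:, 0], bounds[:, 1]
    dim = len(low)
    X = rng.uniform(low, high, size=(n, dim))      # random initial solutions
    local_best = X.copy()                          # each individual's best solution so far
    local_fit = np.array([fitness(x) for x in X])
    gb = np.argmin(local_fit)
    global_best, global_fit = local_best[gb].copy(), local_fit[gb]

    for _ in range(iters):
        for i in range(n):
            # local search within the population + global search toward the best individual
            X[i] = (X[i]
                    + (local_best[i] - X[i]) * rng.random(dim)
                    + (global_best - X[i]) * rng.random(dim))
            X[i] = np.clip(X[i], low, high)
            f = fitness(X[i])
            if f < local_fit[i]:                   # update this individual's best
                local_fit[i], local_best[i] = f, X[i].copy()
                if f < global_fit:                 # update the global best
                    global_fit, global_best = f, X[i].copy()
        # re-seed solutions around the global best (FSA's "update the initial" phase,
        # assumed here from the commonly cited formulation)
        X = global_best + (global_best - local_best) * rng.random((n, dim))
        X = np.clip(X, low, high)
    return global_best, global_fit
```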
“…In the real world, optimization problems are often highly intricate and exhibit nonlinear characteristics [ 12 ], which frequently involve multiple local optima within the objective function. Consequently, deterministic methods often encounter difficulties in escaping local minima when dealing with complex optimization problems [ 13 , 14 ]. Instead, metaheuristics are inspired by phenomena observed in nature and simulate these phenomena to efficiently optimize and solve problems without relying on complex gradient information and mathematical principles thereby better exploring optimal solutions [ 15 , 16 , 17 ].…”
Section: Introduction (mentioning)
Confidence: 99%