2022
DOI: 10.15587/1729-4061.2022.254023

Metaheuristic optimization algorithm based on the two-step Adams-Bashforth method in training multi-layer perceptrons

Abstract: The proposed metaheuristic optimization algorithm based on the two-step Adams-Bashforth scheme (MOABT) is used in this paper for the first time for multilayer perceptron (MLP) training. In computer science and mathematical optimization, a metaheuristic is a high-level procedure or set of guidelines designed to find, devise, or select a search method that can obtain high-quality solutions to an optimization problem, especially when the available information is insufficient or incomplete, or when computational capacity is limited. Many metaheuristic…
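The abstract only names the two-step Adams-Bashforth scheme as the basis of MOABT, so the snippet below is a minimal sketch of one plausible reading: applying the classical AB2 update w_{k+1} = w_k + h(1.5 f(w_k) - 0.5 f(w_{k-1})) with f(w) = -grad L(w) to the weights of a small MLP. The XOR data, network size, step size h, and use of backpropagated gradients are illustrative assumptions, not the paper's actual MOABT algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-hidden-layer MLP (size, data and step size are illustrative assumptions)
n_in, n_hid, n_out = 2, 8, 1
W1 = rng.normal(0.0, 0.5, (n_in, n_hid)); b1 = np.zeros(n_hid)
W2 = rng.normal(0.0, 0.5, (n_hid, n_out)); b2 = np.zeros(n_out)

def pack(W1, b1, W2, b2):
    return np.concatenate([W1.ravel(), b1, W2.ravel(), b2])

def unpack(w):
    i = 0
    W1 = w[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = w[i:i + n_hid]; i += n_hid
    W2 = w[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    return W1, b1, W2, w[i:]

# XOR training set, used here only as a small example
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
Y = np.array([[0], [1], [1], [0]], float)

def loss_and_grad(w):
    W1, b1, W2, b2 = unpack(w)
    H = np.tanh(X @ W1 + b1)                      # hidden activations
    P = 1.0 / (1.0 + np.exp(-(H @ W2 + b2)))      # sigmoid outputs
    E = P - Y
    loss = 0.5 * np.mean(np.sum(E ** 2, axis=1))  # mean squared error
    dOut = E * P * (1.0 - P) / X.shape[0]         # backpropagation
    gW2, gb2 = H.T @ dOut, dOut.sum(0)
    dH = (dOut @ W2.T) * (1.0 - H ** 2)
    gW1, gb1 = X.T @ dH, dH.sum(0)
    return loss, pack(gW1, gb1, gW2, gb2)

# Two-step Adams-Bashforth applied to the gradient flow w'(t) = -grad L(w):
#   w_{k+1} = w_k + h * (1.5 * f(w_k) - 0.5 * f(w_{k-1})),  with f(w) = -grad L(w)
h = 1.0
w = pack(W1, b1, W2, b2)
loss, g = loss_and_grad(w)
f_prev = -g
w = w + h * f_prev                                # bootstrap with one Euler step
for _ in range(5000):
    loss, g = loss_and_grad(w)
    f_curr = -g
    w = w + h * (1.5 * f_curr - 0.5 * f_prev)
    f_prev = f_curr
print("final loss:", loss)
```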

Cited by 3 publications (2 citation statements)
References 26 publications
“…To begin, we'll go over the zero-order Takagi-Sugeno inference system. We are concerned with a fuzzy rule set, defined as follows [20], [21]:

Rule i: IF x_1 is A_{i1} and x_2 is A_{i2} and … and x_n is A_{in} THEN y_i is w_i, (1)

where i = 1, …, M, M is the number of fuzzy rules, w_i is a real number, A_{ij} is a fuzzy subset of the input x_j, and μ_{A_{ij}}(x_j) is a Gaussian membership function of the fuzzy judgment ‘‘x_j is A_{ij}'' defined by…”
Section: Takagi-Sugeno Inference System With Zero-Order
Mentioning confidence: 99%
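Since the quoted rule form and Gaussian membership functions are standard, a short sketch may help make zero-order Takagi-Sugeno inference concrete: each rule's firing strength is the product of its Gaussian membership values, and the output is the firing-strength-weighted average of the crisp consequents w_i. The rule centers, widths, and consequents below are made-up illustrative values, not parameters from the cited work.

```python
import numpy as np

def gaussian_mf(x, c, sigma):
    """Gaussian membership value of input x in the fuzzy set with center c and width sigma."""
    return np.exp(-((x - c) ** 2) / (2.0 * sigma ** 2))

def ts_zero_order(x, centers, sigmas, w):
    """x: (n,) input; centers, sigmas: (M, n) per-rule membership parameters;
    w: (M,) crisp rule consequents. Returns the weighted-average output."""
    # Firing strength of rule i: product of its n membership values
    firing = np.prod(gaussian_mf(x, centers, sigmas), axis=1)
    return np.sum(firing * w) / np.sum(firing)

# Two inputs, three rules (illustrative values only)
centers = np.array([[0.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
sigmas  = np.full((3, 2), 0.5)
w       = np.array([0.0, 1.0, 0.5])
print(ts_zero_order(np.array([0.2, 0.8]), centers, sigmas, w))
```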
“…where 𝐹(𝜒) is a function that is differentiable at least once; for more details, see [1]–[3]. Problem (1) may be solved with conjugate gradient (CG) algorithms, which are based on the iterative relation shown in:…”
Section: Literature Review
Mentioning confidence: 99%
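The quotation truncates before the iterative relation itself, so the following is only a generic sketch of the nonlinear conjugate gradient update χ_{k+1} = χ_k + α_k d_k with d_k = -g_k + β_k d_{k-1}. The Fletcher-Reeves coefficient, Armijo backtracking line search, and Rosenbrock test function are assumptions for illustration, not the specific CG variant of the cited work.

```python
import numpy as np

def cg_minimize(F, grad, chi0, iters=500, tol=1e-8):
    chi = np.asarray(chi0, float)
    g = grad(chi)
    d = -g                                    # first search direction: steepest descent
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                        # safeguard: restart if d is not a descent direction
            d = -g
        # Backtracking (Armijo) line search for the step size alpha_k
        alpha, f0, slope = 1.0, F(chi), g @ d
        while F(chi + alpha * d) > f0 + 1e-4 * alpha * slope:
            alpha *= 0.5
        chi_new = chi + alpha * d             # chi_{k+1} = chi_k + alpha_k * d_k
        g_new = grad(chi_new)
        beta = (g_new @ g_new) / (g @ g)      # Fletcher-Reeves coefficient
        d = -g_new + beta * d                 # d_{k+1} = -g_{k+1} + beta_k * d_k
        chi, g = chi_new, g_new
    return chi

# Usage: minimize the Rosenbrock function (illustrative test problem)
F = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
                           200 * (x[1] - x[0] ** 2)])
print(cg_minimize(F, grad, [-1.2, 1.0]))
```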