2021
DOI: 10.34768/amcs-2021-0039
A modified particle swarm optimization procedure for triggering fuzzy flip-flop neural networks

Abstract: The aim of the presented study is to investigate the application of an optimization algorithm based on swarm intelligence to the configuration of a fuzzy flip-flop neural network. Research on solving this problem consists of the following stages. The first one is to analyze the impact of the basic internal parameters of the neural network and the particle swarm optimization (PSO) algorithm. Subsequently, some modifications to the PSO algorithm are investigated. Approximations of trigonometric functions are the…

Cited by 4 publications (1 citation statement) | References 26 publications
“…PSO was applied to various neural networks to yield better results. Kowalski and Słoczyński (2021) modified PSO in terms of regularization control, geometric swarm centre determination, etc., and applied it to find an optimal configuration of a fuzzy flip-flop, producing the least training error. In the work of Carvalho and Ludermir (2007), the concept of PSO-PSO was developed, wherein inner PSO was used to optimize the weights of an MLP neural network while outer PSO was used to optimize its architecture.…”
Section: Literature Review
confidence: 99%
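The citing passage refers to the standard PSO velocity/position update that both cited works build on. As a point of reference, here is a minimal sketch of generic textbook PSO minimizing a test function — this is not the modified variant of Kowalski and Słoczyński (2021) nor the PSO-PSO scheme of Carvalho and Ludermir (2007); all parameter names and defaults below are illustrative assumptions.

```python
import numpy as np

def pso(fitness, dim, n_particles=30, iters=100,
        w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Minimize `fitness` with a basic particle swarm (generic sketch)."""
    rng = np.random.default_rng(0)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))   # particle positions
    v = np.zeros((n_particles, dim))              # particle velocities
    pbest = x.copy()                              # personal best positions
    pbest_f = np.apply_along_axis(fitness, 1, x)  # personal best values
    gbest = pbest[pbest_f.argmin()].copy()        # global best position
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Classic update: inertia + cognitive pull + social pull
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.apply_along_axis(fitness, 1, x)
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# Example: minimize the sphere function, whose optimum is the origin.
best_x, best_f = pso(lambda z: float(np.sum(z * z)), dim=3)
```

In a neural-network setting such as the one in the cited paper, `fitness` would evaluate the network's training error for a candidate parameter vector; the modifications discussed (regularization control, geometric swarm centre determination) alter how `gbest` and the velocity update are computed.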