2019
DOI: 10.1007/978-3-030-12127-3_11

Salp Swarm Algorithm: Theory, Literature Review, and Application in Extreme Learning Machines

Cited by 106 publications (56 citation statements)
References 60 publications
“…T is the weight vector connecting the i-th hidden-layer neuron with the output neuron [21]. When the number of neurons in the hidden layer equals the number of samples in the training set, then for randomly selected $\omega$ and $b$, an SLFN with $N$ hidden neurons and activation function $g(x)$ can approximate the training samples with zero error, i.e., $\sum_{j=1}^{N} \lVert o_j - t_j \rVert = 0$.…”
Section: Extreme Learning Machine and Salp Swarm Algorithm
confidence: 99%
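The zero-error property quoted above is easy to see in code: with as many hidden neurons as training samples, the hidden-layer output matrix H is square and, for random weights, generically invertible, so output weights computed from its (pseudo)inverse reproduce the targets exactly. Below is a minimal NumPy sketch of this construction; the sigmoid activation, the uniform weight ranges, and the names elm_fit/elm_predict are illustrative assumptions, not the chapter's own code.

```python
import numpy as np

def elm_fit(X, T, n_hidden, rng=None):
    """Minimal ELM sketch: random hidden layer, analytic output weights."""
    rng = np.random.default_rng(0) if rng is None else rng
    n_features = X.shape[1]
    # Input weights (omega) and biases b are drawn at random and never trained.
    W = rng.uniform(-1.0, 1.0, size=(n_features, n_hidden))
    b = rng.uniform(-1.0, 1.0, size=n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))      # g(x): sigmoid activation
    # Output weights via the Moore-Penrose pseudoinverse: beta = H^+ T.
    beta = np.linalg.pinv(H) @ T
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# With n_hidden equal to the number of training samples, H is square and
# generically invertible, so the training error is near machine precision.
X = np.random.default_rng(1).normal(size=(8, 3))
T = np.sin(X.sum(axis=1, keepdims=True))
W, b, beta = elm_fit(X, T, n_hidden=8)
print(np.max(np.abs(elm_predict(X, W, b, beta) - T)))  # ~1e-12
```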
“…where $x_j^1$ stands for the position of the leader, which changes only according to the position of the food source; $F_j$ is the position of the food source in the j-th dimension; $ub_j$ and $lb_j$ are the upper and lower bounds of the j-th dimension, respectively; and $c_2$ and $c_3$ are random values in the interval $[0,1]$ that dictate the step size and whether the leader's next position in the j-th dimension moves toward positive or negative infinity. $c_1$ is the main parameter for balancing exploration and exploitation in SSA, and its expression is [21]:…”
Section: Salp Swarm Algorithm
confidence: 99%
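The leader-update rule described in this quote is straightforward to implement. In the original SSA paper the coefficient is $c_1 = 2e^{-(4l/L)^2}$, where $l$ is the current iteration and $L$ the iteration budget, and implementations commonly split the direction of the move on $c_3 \ge 0.5$. The NumPy sketch below assumes exactly those conventions; the function name and the 0.5 threshold are implementation choices, not fixed by the quoted text.

```python
import numpy as np

def ssa_leader_update(F, lb, ub, l, L, rng=None):
    """One leader-position update of the Salp Swarm Algorithm.

    F, lb, ub are per-dimension arrays (food source and bounds);
    l is the current iteration, L the maximum number of iterations.
    """
    rng = np.random.default_rng() if rng is None else rng
    # c1 balances exploration and exploitation; it decays over iterations
    # following the standard SSA expression c1 = 2 * exp(-(4l/L)^2).
    c1 = 2.0 * np.exp(-((4.0 * l / L) ** 2))
    c2 = rng.random(F.shape)   # random step-size factor in [0, 1]
    c3 = rng.random(F.shape)   # random direction switch in [0, 1]
    step = c1 * ((ub - lb) * c2 + lb)
    # Move around the food source toward +/- depending on c3.
    return np.where(c3 >= 0.5, F + step, F - step)
```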
“…SSA is a logical choice for selecting the parameters of HHOA. In this work, the HHOA parameters were optimized using the salp swarm algorithm (SSA) 33 , which improves the convergence rate and reduces the complexity, thereby minimizing computational time.…”
Section: Our Proposed Hybrid HH-SS Optimization Algorithm
confidence: 99%
“…Recently, several metaheuristic algorithms have been devised to solve different optimization problems, for instance the Salp Swarm Algorithm [6], the Grasshopper Optimization Algorithm [7], the Polar Bear Algorithm (Dawid Połap, 2017) [8], Coyote Optimization (Juliano Pierezan and Leandro dos Santos Coelho, 2018) [9], and the Two-Cored Flower Pollination Algorithm (FPA) (Raza, 2020) [10]. The devised hybrid modified FPA/mixed-integer linear programming (MILP) algorithm has been compared with these recent algorithms in terms of energy-consumption cost, GHG emissions, and execution time for the devised energy-management control in a residential µG.…”
Section: Introduction
confidence: 99%