“…Optimization algorithms (OAs) have been proposed as viable alternatives to gradient-based MLP training approaches in this regard. Numerous such algorithms have been reported in the literature, including the group search optimizer (GSO) [5], symbiotic organisms search (SOS) algorithm [6], lightning search algorithm (LSA) [7], ant lion optimizer (ALO) [8], krill herd algorithm (KHA) [9], grasshopper optimization algorithm (GOA) [10,11], artificial bee colony (ABC) [12], social spider optimization algorithm (SSO) [13], a hybrid of ABC and the dragonfly algorithm (DA) [14], ant colony optimization (ACO) [15], particle swarm optimization (PSO) [16], cuckoo search (CS) [17,18], moth-flame optimization (MFO) [19,20], whale optimization algorithm (WOA) [21], gray wolf optimizer (GWO) [22,23], black hole algorithm (BHA) [24], invasive weed optimization (IWO) [25], multiverse optimizer algorithm (MOA) [26,27], bat algorithm (BA) [28], and salp swarm algorithm (SSA) [29].…”
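To illustrate the common idea behind these approaches, the sketch below trains a tiny MLP on the XOR problem by letting a basic particle swarm optimizer search over the flattened weight vector instead of applying gradient descent. The network size, PSO constants, and helper names are assumptions made for this example only and do not reproduce any of the cited implementations.

```python
# Illustrative sketch (not from the cited works): training a small MLP with a
# plain global-best PSO instead of gradient descent. Layer sizes and PSO
# constants are arbitrary choices for the demo.
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR, a classic non-linearly-separable problem.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# MLP architecture: 2 inputs -> 4 hidden units (tanh) -> 1 output (sigmoid).
n_in, n_hid, n_out = 2, 4, 1
dim = n_in * n_hid + n_hid + n_hid * n_out + n_out  # total weights + biases


def unpack(theta):
    """Split a flat parameter vector into weight matrices and bias vectors."""
    i = 0
    W1 = theta[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = theta[i:i + n_hid]; i += n_hid
    W2 = theta[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    b2 = theta[i:i + n_out]
    return W1, b1, W2, b2


def mse(theta):
    """Fitness function: mean squared error of the MLP defined by theta."""
    W1, b1, W2, b2 = unpack(theta)
    h = np.tanh(X @ W1 + b1)
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    return float(np.mean((out - y) ** 2))


# Global-best PSO over the flattened MLP parameters.
n_particles, n_iters = 30, 300
w, c1, c2 = 0.7, 1.5, 1.5                        # inertia and acceleration terms
pos = rng.uniform(-1, 1, (n_particles, dim))      # candidate weight vectors
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_fit = np.array([mse(p) for p in pos])
gbest = pbest[np.argmin(pbest_fit)].copy()

for _ in range(n_iters):
    r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    fit = np.array([mse(p) for p in pos])
    improved = fit < pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    gbest = pbest[np.argmin(pbest_fit)].copy()

print(f"best MSE after PSO training: {mse(gbest):.4f}")
```

The other metaheuristics listed above follow the same pattern: the MLP weights and biases are encoded as a candidate solution, and the training error serves as the fitness function that the population-based search minimizes.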