This paper introduces AdaSwarm, a novel gradient-free optimizer whose performance is similar to, or better than, that of the Adam optimizer widely adopted in neural networks. To support AdaSwarm, a novel Exponentially weighted Momentum Particle Swarm Optimizer (EMPSO) is proposed. AdaSwarm's ability to tackle optimization problems is attributed to its capacity for good gradient approximations. We show that the gradient of any function, differentiable or not, can be approximated using the parameters of EMPSO. This novel technique for simulating gradient descent lies at the boundary between numerical methods and swarm intelligence. Mathematical proofs of the gradient approximation are also provided. AdaSwarm competes closely with several state-of-the-art (SOTA) optimizers. We also show that AdaSwarm can handle a variety of loss functions during backpropagation, including the maximum absolute error (MAE).
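As a rough sketch of the core idea (not the paper's exact formulation), the update below folds an exponentially weighted average of past velocities into the vanilla PSO velocity rule; the function name empso_step and the coefficient values beta, c1, c2 are illustrative assumptions.

```python
import numpy as np

def empso_step(x, v, m, pbest, gbest, beta=0.9, c1=0.8, c2=0.9, rng=None):
    """One EMPSO-style update for a single particle (illustrative sketch).

    x, v, m : current position, velocity, and exponentially weighted momentum
    pbest   : this particle's best-known position
    gbest   : the swarm's best-known position
    """
    rng = rng or np.random.default_rng()
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    # Exponentially weighted average of past velocities replaces the
    # plain inertia term (omega * v) of vanilla PSO.
    m = beta * m + (1.0 - beta) * v
    v = m + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return x + v, v, m
```

The paper's gradient approximation is built from quantities appearing in this update (the coefficients, random factors, and distances to the best positions); the exact expression and its proofs are given in the paper itself.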
This paper introduces the application of Exponentially Averaged Momentum Particle Swarm Optimization (EM-PSO) as a derivative-free optimizer for neural networks. It retains PSO's major advantages, such as search-space exploration and greater robustness to local minima, compared to gradient-descent optimizers such as Adam. Neural-network-based solvers endowed with gradient optimization are now being used to approximate solutions to differential equations. Here, we demonstrate the novelty of EM-PSO in approximating gradients and leverage this property to solve the Schrödinger equation for the particle-in-a-box problem. We also provide the optimal set of hyper-parameters for our algorithm, supported by mathematical proofs.
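A minimal sketch of this kind of experiment, assuming natural units (hbar = m = 1) and a unit box: the ground state is found by minimising a finite-difference Rayleigh quotient over a small sine basis. SciPy's differential_evolution stands in here as the derivative-free optimizer; EM-PSO would slot into the same role.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Particle in a box on [0, 1] with H = -(1/2) d^2/dx^2 (hbar = m = 1).
# Minimising the Rayleigh quotient <psi|H|psi> / <psi|psi> over a trial
# wavefunction recovers the ground-state energy.
x = np.linspace(0.0, 1.0, 400)
dx = x[1] - x[0]
# Sine modes automatically satisfy the boundary conditions psi(0) = psi(1) = 0.
basis = np.array([np.sin(k * np.pi * x) for k in range(1, 6)])

def rayleigh(coeffs):
    psi = coeffs @ basis
    d2 = np.gradient(np.gradient(psi, dx), dx)  # finite-difference psi''
    return np.sum(-0.5 * psi * d2) / np.sum(psi * psi)

res = differential_evolution(rayleigh, bounds=[(-1, 1)] * 5, seed=0)
print(res.fun)  # approaches pi**2 / 2 = 4.9348, the exact ground-state energy
```

The Rayleigh quotient is scale-invariant in the coefficients, so the optimizer only has to find the direction of the lowest mode, which makes the problem well suited to population-based, derivative-free search.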
We have adapted the use of exponentially averaged momentum in PSO to multi-objective optimization (MOO) problems. The algorithm is built on top of SMPSO, a state-of-the-art MOO solver, and we present a novel mathematical analysis of constriction fairness. We extend this analysis to the use of momentum and propose a rich set of theoretically sound parameter alternatives. We call the proposed algorithm Fairly Constricted PSO with Exponentially-Averaged Momentum (FCPSO-em).
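For reference, SMPSO-style updates are built on the Clerc-Kennedy constriction coefficient, which can be computed as below; the fallback value for phi <= 4 is an assumption of this sketch, not a claim about FCPSO-em's analysis.

```python
import math

def constriction(c1: float, c2: float) -> float:
    """Clerc-Kennedy constriction coefficient for a PSO velocity update.

    For phi = c1 + c2 > 4 the factor damps the velocity so the swarm
    provably converges; this sketch simply returns 1.0 otherwise.
    """
    phi = c1 + c2
    if phi <= 4.0:
        return 1.0  # assumed fallback for this sketch
    return 2.0 / abs(2.0 - phi - math.sqrt(phi * phi - 4.0 * phi))

print(constriction(2.05, 2.05))  # ~0.7298, the classic constricted setting
```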
I. PARTICLE SWARM OPTIMIZATION

A. Vanilla PSO

Particle Swarm Optimization (PSO) was first proposed by Kennedy and Eberhart [1], [2] in 1995 as an attempt to model the behaviour of bird flocks. $N$ particles are initialised at random positions and velocities in the search space, and the $i$-th particle updates its trajectory according to

$$v_i^{(t+1)} = \omega\, v_i^{(t)} + c_1 r_1 \left(pbest_i - x_i^{(t)}\right) + c_2 r_2 \left(gbest - x_i^{(t)}\right), \qquad x_i^{(t+1)} = x_i^{(t)} + v_i^{(t+1)},$$

where $\omega$ is the inertia weight, $c_1, c_2$ are the cognitive and social coefficients, and $r_1, r_2 \sim U(0, 1)$.
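A minimal, self-contained sketch of this update rule, shown on the 2-D sphere function; the parameter values omega = 0.7 and c1 = c2 = 1.5 are common defaults, not values prescribed here.

```python
import numpy as np

def pso(f, dim=2, n_particles=30, iters=200,
        omega=0.7, c1=1.5, c2=1.5, seed=0):
    """Vanilla PSO: minimise f over R^dim with the update rule above."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, (n_particles, dim))  # random initial positions
    v = np.zeros_like(x)                            # zero initial velocities
    pbest = x.copy()
    pval = np.apply_along_axis(f, 1, x)
    gbest = pbest[pval.argmin()].copy()
    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # Velocity update: inertia + cognitive pull + social pull.
        v = omega * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        val = np.apply_along_axis(f, 1, x)
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
        gbest = pbest[pval.argmin()].copy()
    return gbest, pval.min()

best, fbest = pso(lambda p: np.sum(p * p))
print(best, fbest)  # converges towards the global minimum at the origin
```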