INTRODUCTION

Artificial Neural Networks (ANNs), a branch of artificial intelligence, are among the most frequently used classification algorithms for solving data mining problems in real applications [1]. The training process is of great importance for the success of ANNs, and several learning algorithms have been proposed in the literature. The back-propagation (BP) learning algorithm is the one most commonly used to train ANNs, but it may fall into local minima and converge slowly, a problem caused by neuron saturation in the hidden layer [2]. Recently, researchers have therefore turned to optimization algorithms with global search capability in place of the BP learning algorithm, in order to keep ANNs from becoming stuck in local minima.

In [3], the tabu search algorithm was examined as an alternative to the problematic back-propagation algorithm; over seven test functions, tabu search yielded significantly better results than back-propagation. [4] investigated the performance of a variant of the hill climbing algorithm for artificial neural network training and compared its results with those of simulated annealing and standard hill climbing. The particle swarm optimization (PSO) algorithm adopted in [5] is one of the most important algorithms, showing notable performance for training ANNs. Another study trained ANNs using the Convexity Based Algorithm (CBA); its results indicate that the CBA fills a critical gap in the use of classification algorithms [6]. In [7], the researchers adopted a hybrid approach that combines the PSO algorithm with the gravitational search algorithm to solve the ANN training problem; the proposed method reduced the problems of falling into local minima and slow convergence, and it outperformed the other approaches it was compared against [7]. [8] sought to take advantage of both local and global search by merging PSO with the back-propagation algorithm; the statistical results showed PSO to be a robust algorithm for training ANNs, and the authors also compared a method based on a modified differential evolution algorithm for training ANNs. In [9], the researchers used an adaptive differential evolution algorithm to train ANNs, with the aim of finding the optimal weights; the adopted algorithm was compared with other methods on different classification problems [9]. Das et al. [10] used the PSO algorithm to train ANNs for the channel equalization problem; the proposed equalizer achieved better results than the fuzzy equalizers under all conditions [10]. [11] used both the PSO algorithm and another important nature-inspired algorithm, ANN-Bee colony
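To make the idea behind these studies concrete, the sketch below shows, in rough outline, how a population-based optimizer such as PSO can replace BP: the entire weight vector of the network is treated as a single position in the search space, and the training error serves as the fitness to be minimized. The network size, swarm settings, and toy XOR data are illustrative assumptions for this sketch only and are not taken from the cited studies.

import numpy as np

rng = np.random.default_rng(0)

# Toy dataset (XOR) just to keep the example self-contained.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

n_in, n_hid, n_out = 2, 4, 1
dim = n_in * n_hid + n_hid + n_hid * n_out + n_out  # total weights and biases

def forward(w, X):
    """Unpack a flat weight vector and run the one-hidden-layer network."""
    i = 0
    W1 = w[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = w[i:i + n_hid]; i += n_hid
    W2 = w[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    b2 = w[i:i + n_out]
    h = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output

def fitness(w):
    """Mean squared error on the training set: the quantity PSO minimizes."""
    return np.mean((forward(w, X) - y) ** 2)

# Standard PSO with inertia weight and cognitive/social terms.
n_particles, n_iter = 30, 300
w_inertia, c1, c2 = 0.7, 1.5, 1.5

pos = rng.uniform(-1, 1, (n_particles, dim))   # each particle is a full weight vector
vel = np.zeros((n_particles, dim))
pbest = pos.copy()
pbest_err = np.array([fitness(p) for p in pos])
gbest = pbest[np.argmin(pbest_err)].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
    vel = w_inertia * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    err = np.array([fitness(p) for p in pos])
    improved = err < pbest_err
    pbest[improved], pbest_err[improved] = pos[improved], err[improved]
    gbest = pbest[np.argmin(pbest_err)].copy()

print("final MSE:", fitness(gbest))
print("predictions:", forward(gbest, X).ravel().round(2))

Because the optimizer only queries the fitness function and never uses gradients, the PSO loop above could in principle be swapped for tabu search, differential evolution, or a gravitational search update without changing the network encoding, which is the common thread running through the studies cited above.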