Differential evolution (DE) is a simple and efficient population-based stochastic algorithm for solving global numerical optimization problems. The performance of DE depends largely on its parameter values and search strategy, yet guidance on how to choose the best values for these parameters is scarce. This paper presents a consistent methodology for tuning optimal parameter values. At the heart of the methodology is an artificial neural network (ANN) that learns the relationship between parameter values and algorithm performance. First, a data set is generated and normalized; then the ANN is trained on it; finally, the best parameter values are extracted. The proposed method is evaluated on a set of 24 test problems from the Black-Box Optimization Benchmarking (BBOB) suite. Experimental results show that three distinct cases may arise when applying this method, and the procedure to follow in each case is specified. Finally, a comparison with four tuning rules is performed to verify and validate the proposed method's performance. This study provides thorough insight into optimal parameter tuning and should be of practical use to practitioners.
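The sketch below illustrates, under simplifying assumptions, the kind of pipeline the abstract describes: run a basic DE variant with different (F, CR) settings on a test function, record the resulting performance, normalize the data, fit a small neural network that links parameter values to performance, and pick the parameters the model predicts to work best. The sphere function, the parameter grid, and the network architecture are illustrative choices, not taken from the paper.

```python
# Minimal sketch of ANN-based DE parameter tuning (illustrative assumptions only).
import numpy as np
from sklearn.neural_network import MLPRegressor

def sphere(x):
    # Simple separable test function; stand-in for a BBOB problem.
    return float(np.sum(x * x))

def de(func, dim=10, pop_size=30, F=0.5, CR=0.9, gens=200, seed=0):
    """Basic DE/rand/1/bin; returns the best objective value found."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5, 5, (pop_size, dim))
    fit = np.array([func(p) for p in pop])
    for _ in range(gens):
        for i in range(pop_size):
            idx = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(idx, 3, replace=False)]
            mutant = a + F * (b - c)                 # differential mutation
            mask = rng.random(dim) < CR              # binomial crossover mask
            mask[rng.integers(dim)] = True
            trial = np.where(mask, mutant, pop[i])
            f = func(trial)
            if f <= fit[i]:                          # greedy selection
                pop[i], fit[i] = trial, f
    return fit.min()

# 1) Generate a data set: DE performance for sampled (F, CR) pairs.
params = np.array([(F, CR) for F in np.linspace(0.1, 1.0, 10)
                            for CR in np.linspace(0.1, 1.0, 10)])
perf = np.array([de(sphere, F=F, CR=CR) for F, CR in params])

# 2) Normalize inputs and targets to comparable scales.
X = (params - params.mean(0)) / params.std(0)
y = (perf - perf.mean()) / perf.std()

# 3) Fit an ANN that maps parameter values to performance.
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000,
                     random_state=0).fit(X, y)

# 4) Extract the parameter values the model predicts to perform best.
best_F, best_CR = params[np.argmin(model.predict(X))]
print(f"predicted best parameters: F={best_F:.2f}, CR={best_CR:.2f}")
```

In practice the data set would cover several benchmark problems and more parameters (e.g., population size), but the structure of the pipeline stays the same.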