Abstract. Differential evolution algorithms provide an efficient framework for solving complicated optimization tasks with many variables and complex constraints. Nevertheless, the classic differential evolution algorithm does not guarantee convergence to the global minimum of the cost function. The authors therefore developed a modification of this algorithm that ensures asymptotic global convergence. This article compares the ability of three algorithms to identify the global minimum of the cost function: the classic differential evolution algorithm, the above-mentioned modified differential evolution algorithm, and random sampling enhanced by a hill-climbing procedure. We designed a series of numerical experiments to perform this comparison. The results indicate that the classic differential evolution algorithm is, in general, an extremely poor global optimizer (global minimum found in 2% of cases), whereas the modified differential evolution algorithm performed considerably better (global minimum found in 83% of cases).
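
For readers unfamiliar with the baseline being compared, the following is a minimal sketch of the classic differential evolution scheme (the common DE/rand/1/bin variant). The abstract does not specify the variant or parameter values used in the paper; the function name, default parameters, and bound handling below are illustrative assumptions, not the authors' implementation.

```python
import random

def differential_evolution(cost, bounds, pop_size=20, F=0.8, CR=0.9, generations=200):
    """Minimal sketch of classic DE (DE/rand/1/bin).

    `cost` maps a list of floats to a scalar; `bounds` is a list of
    (low, high) pairs, one per variable.  Parameter values are common
    defaults, not taken from the paper.
    """
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    costs = [cost(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: combine three distinct members other than i (rand/1).
            a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
            mutant = [pop[a][d] + F * (pop[b][d] - pop[c][d]) for d in range(dim)]
            # Binomial crossover, forcing at least one mutant component.
            jrand = random.randrange(dim)
            trial = [mutant[d] if (random.random() < CR or d == jrand) else pop[i][d]
                     for d in range(dim)]
            # Clip the trial vector back into the feasible box.
            trial = [min(max(t, lo), hi) for t, (lo, hi) in zip(trial, bounds)]
            # Greedy selection: keep the trial only if it is no worse.
            tc = cost(trial)
            if tc <= costs[i]:
                pop[i], costs[i] = trial, tc
    best = min(range(pop_size), key=costs.__getitem__)
    return pop[best], costs[best]
```

Because selection is purely greedy and mutation draws only on the current population, this scheme can stagnate in a local basin, which is consistent with the convergence limitation the abstract describes.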