In this study, the benefits of choosing a robust optimizer for super resolution are analyzed. For this purpose, different optimizers are plugged into the simple Convolutional Neural Network (CNN) architecture SRNET to reveal the performance of each method. The findings of this research show that the Adam and Nadam optimizers are more robust than Stochastic Gradient Descent (SGD), Adagrad, Adamax, and RMSprop. In our experimental simulations, the Adam and Nadam optimizers achieved scores of 35.91 dB/0.9960 and 35.97 dB/0.9961 (PSNR/SSIM), respectively, on the Set5 images, using a 9-1-5 network structure with filter sizes 128 and 64. These results show that the optimization function selected for the CNN model plays an important role in increasing accuracy in the super-resolution problem.
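To make the 9-1-5 configuration concrete, the sketch below computes the trainable-parameter count and receptive field of such a network, assuming an SRCNN-style three-convolution layout (kernel sizes 9, 1, 5; filter counts 128 and 64; single-channel input and output). The exact SRNET implementation may differ; this is only an illustration of the quoted structure.

```python
# Parameter count and receptive field of a 9-1-5 super-resolution CNN
# (assumed SRCNN-style layout; filter counts and kernel sizes from the text).

def conv_params(k, c_in, c_out):
    """Weights plus biases of one k x k convolution layer."""
    return k * k * c_in * c_out + c_out

# (kernel size, input channels, output channels) for each layer
layers = [(9, 1, 128), (1, 128, 64), (5, 64, 1)]

total = sum(conv_params(k, ci, co) for k, ci, co in layers)
receptive_field = 1 + sum(k - 1 for k, _, _ in layers)

print(total)            # 20353 trainable parameters
print(receptive_field)  # 13 x 13 pixel receptive field
```

With only about 20k parameters, such a network is small enough that the choice of optimizer, rather than model capacity, can dominate the final reconstruction quality.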